unienv_data

BatchT module-attribute

BatchT = TypeVar('BatchT')

SamplerBatchT module-attribute

SamplerBatchT = TypeVar('SamplerBatchT')

SamplerArrayType module-attribute

SamplerArrayType = TypeVar('SamplerArrayType')

SamplerDeviceType module-attribute

SamplerDeviceType = TypeVar('SamplerDeviceType')

SamplerDtypeType module-attribute

SamplerDtypeType = TypeVar('SamplerDtypeType')

SamplerRNGType module-attribute

SamplerRNGType = TypeVar('SamplerRNGType')

IndexableType module-attribute

IndexableType = Union[int, slice, EllipsisType]

BatchBase

BatchBase(single_space: Space[BatchT, BDeviceType, BDtypeType, BRNGType], single_metadata_space: Optional[DictSpace[BDeviceType, BDtypeType, BRNGType]] = None)

Bases: ABC, Generic[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

Base interface for dataset-like collections of structured samples.

BatchBase stores the per-sample space definition and provides a uniform API for retrieving either structured values or their flattened backend-array representation. Concrete subclasses decide where data lives and which mutation operations are supported.

Initialize the batch contract for one logical sample.
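The contract above can be illustrated with a minimal sketch. This hypothetical `ListBatch` is not part of unienv_data; it only mirrors the structured accessors (`get_at`, `set_at`, `append`, `extend`, `remove_at`) on a plain Python list, omitting the flattened accessors that a real subclass would implement against a ComputeBackend.

```python
# Hypothetical sketch of the BatchBase contract on plain Python lists.
# Real subclasses also implement get_flattened_at / set_flattened_at etc.
class ListBatch:
    is_mutable = True  # mirrors BatchBase.is_mutable

    def __init__(self):
        self._items = []

    def __len__(self):
        return len(self._items)

    def get_at(self, idx):
        # Structured read: an int, a slice, or a list of indices.
        if isinstance(idx, list):
            return [self._items[i] for i in idx]
        return self._items[idx]

    def set_at(self, idx, value):
        self._items[idx] = value

    def append(self, value):
        self._items.append(value)

    def extend(self, values):
        self._items.extend(values)

    def remove_at(self, idx):
        del self._items[idx]

batch = ListBatch()
batch.extend([{"obs": 1}, {"obs": 2}, {"obs": 3}])
batch.remove_at(0)
```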

is_mutable class-attribute instance-attribute

is_mutable: bool = True

single_space instance-attribute

single_space = single_space

single_metadata_space instance-attribute

single_metadata_space = single_metadata_space

backend property

backend: ComputeBackend[BArrayType, BDeviceType, BDtypeType, BRNGType]

device property

device: Optional[BDeviceType]

get_flattened_at

get_flattened_at(idx: Union[IndexableType, BArrayType]) -> BArrayType

Fetch samples as flattened backend arrays.

get_flattened_at_with_metadata

get_flattened_at_with_metadata(idx: Union[IndexableType, BArrayType]) -> Tuple[BArrayType, Optional[Dict[str, Any]]]

Fetch flattened samples together with optional per-sample metadata.

set_flattened_at

set_flattened_at(idx: Union[IndexableType, BArrayType], value: BArrayType) -> None

Overwrite existing samples using flattened data.

append_flattened

append_flattened(value: BArrayType) -> None

Append one flattened sample to the batch.

extend_flattened

extend_flattened(value: BArrayType) -> None

Append a batched block of flattened samples.

get_at

get_at(idx: Union[IndexableType, BArrayType]) -> BatchT

Fetch samples in their structured form.

get_at_with_metadata

get_at_with_metadata(idx: Union[IndexableType, BArrayType]) -> Tuple[BatchT, Optional[Dict[str, Any]]]

Fetch structured samples together with optional metadata.

set_at

set_at(idx: Union[IndexableType, BArrayType], value: BatchT) -> None

Overwrite existing samples using structured data.

remove_at

remove_at(idx: Union[IndexableType, BArrayType]) -> None

Remove one or more samples from the batch.

append

append(value: BatchT) -> None

Append one structured sample to the batch.

extend

extend(value: BatchT) -> None

Append a batched block of structured samples.

extend_from

extend_from(other: BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType], chunk_size: int = 8, tqdm: bool = False) -> None

Copy data from another batch in bounded-size chunks.
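The chunked-copy pattern that `extend_from` implies can be sketched as follows. This is an illustrative standalone function, not the library implementation: it reads the source in bounded slices so peak memory scales with `chunk_size` rather than with the source length.

```python
# Hedged sketch of extend_from's chunking: copy src into dest in
# bounded-size blocks instead of materializing src all at once.
def extend_from(dest, src, chunk_size=8):
    for start in range(0, len(src), chunk_size):
        block = src[start:start + chunk_size]  # stands in for other.get_at(slice)
        dest.extend(block)                     # stands in for self.extend(block)

dst, src = [], list(range(20))
extend_from(dst, src, chunk_size=8)
```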

get_slice

get_slice(idx: Union[IndexableType, BArrayType]) -> BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a lazy view over a subset of indices.

get_column

get_column(nested_keys: Sequence[str]) -> BatchBase[Any, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a lazy view over a nested field inside each sample.
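The nested-key lookup behind `get_column` can be sketched like this. The helper below is hypothetical; it only shows how a sequence of keys selects one field inside each structured sample.

```python
# Illustrative sketch of get_column's nested-key traversal: follow each
# key in order into a structured sample to reach a single field.
def get_column_value(sample, nested_keys):
    for key in nested_keys:
        sample = sample[key]
    return sample

value = get_column_value({"obs": {"image": 7}}, ["obs", "image"])
```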

close

close() -> None

BatchSampler

BatchSampler(single_space: Space[BatchT, BDeviceType, BDtypeType, BRNGType], single_metadata_space: Optional[DictSpace[BDeviceType, BDtypeType, BRNGType]] = None, batch_size: int = 1)

Bases: Generic[SamplerBatchT, SamplerArrayType, SamplerDeviceType, SamplerDtypeType, SamplerRNGType, BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType], BatchBase[SamplerBatchT, SamplerArrayType, SamplerDeviceType, SamplerDtypeType, SamplerRNGType]

Read-only batch wrapper that samples mini-batches from another batch.

Configure the sampling batch shape without binding the source data yet.

single_space instance-attribute

single_space = single_space

single_metadata_space instance-attribute

single_metadata_space = single_metadata_space

backend property

backend: ComputeBackend[BArrayType, BDeviceType, BDtypeType, BRNGType]

device property

device: Optional[BDeviceType]

data instance-attribute

data: BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

rng class-attribute instance-attribute

rng: Optional[SamplerRNGType] = None

data_rng class-attribute instance-attribute

data_rng: Optional[BRNGType] = None

is_mutable class-attribute instance-attribute

is_mutable: bool = False

batch_size instance-attribute

batch_size = batch_size

sampled_space property

sampled_space: Space[SamplerBatchT, SamplerDeviceType, SamplerDtypeType, SamplerRNGType]

sampled_metadata_space property

sampled_metadata_space: Optional[DictSpace[SamplerDeviceType, SamplerDtypeType, SamplerRNGType]]

get_flattened_at

get_flattened_at(idx: Union[IndexableType, BArrayType]) -> BArrayType

Fetch samples as flattened backend arrays.

get_flattened_at_with_metadata

get_flattened_at_with_metadata(idx: Union[IndexableType, BArrayType]) -> Tuple[BArrayType, Optional[Dict[str, Any]]]

Fetch flattened samples together with optional per-sample metadata.

set_flattened_at

set_flattened_at(idx: Union[IndexableType, BArrayType], value: BArrayType) -> None

Overwrite existing samples using flattened data.

append_flattened

append_flattened(value: BArrayType) -> None

Append one flattened sample to the batch.

extend_flattened

extend_flattened(value: BArrayType) -> None

Append a batched block of flattened samples.

get_at

get_at(idx: Union[IndexableType, BArrayType]) -> BatchT

Fetch samples in their structured form.

get_at_with_metadata

get_at_with_metadata(idx: Union[IndexableType, BArrayType]) -> Tuple[BatchT, Optional[Dict[str, Any]]]

Fetch structured samples together with optional metadata.

set_at

set_at(idx: Union[IndexableType, BArrayType], value: BatchT) -> None

Overwrite existing samples using structured data.

remove_at

remove_at(idx: Union[IndexableType, BArrayType]) -> None

Remove one or more samples from the batch.

append

append(value: BatchT) -> None

Append one structured sample to the batch.

extend

extend(value: BatchT) -> None

Append a batched block of structured samples.

extend_from

extend_from(other: BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType], chunk_size: int = 8, tqdm: bool = False) -> None

Copy data from another batch in bounded-size chunks.

get_slice

get_slice(idx: Union[IndexableType, BArrayType]) -> BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a lazy view over a subset of indices.

get_column

get_column(nested_keys: Sequence[str]) -> BatchBase[Any, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a lazy view over a nested field inside each sample.

manual_seed

manual_seed(seed: int) -> None

Reset sampler RNG state, including the optional data-index RNG.

sample_index

sample_index() -> SamplerArrayType

Draw random indices for one batch.
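A plausible reading of `sample_index` is uniform index drawing with the sampler's RNG, which is what makes `manual_seed` reproducible. The function below is an assumption-level sketch using Python's stdlib RNG, not the backend RNG the real sampler uses.

```python
import random

# Hedged sketch of sample_index: draw batch_size uniform indices into
# the bound data using the sampler's RNG state.
def sample_index(data_len, batch_size, rng):
    return [rng.randrange(data_len) for _ in range(batch_size)]

rng = random.Random(0)
idx = sample_index(100, 4, rng)
```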

sample_flat

sample_flat() -> SamplerArrayType

Sample a batch and return it in flattened form.

sample_flat_with_metadata

sample_flat_with_metadata() -> Tuple[SamplerArrayType, Optional[Dict[str, Any]]]

Sample a flattened batch together with optional metadata.

sample

sample() -> SamplerBatchT

Sample a batch in its structured form.

sample_with_metadata

sample_with_metadata() -> Tuple[SamplerBatchT, Optional[Dict[str, Any]]]

Sample a structured batch together with optional metadata.

epoch_iter

epoch_iter() -> Iterator[SamplerBatchT]

Iterate once over a random permutation of the source data.
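The epoch semantics described here (one pass over a random permutation) can be sketched as a generator. This is illustrative only; the real `epoch_iter` yields structured batches from the bound data rather than plain lists.

```python
import random

# Illustrative sketch of epoch_iter: shuffle all indices once, then
# yield fixed-size mini-batches until the data is exhausted.
def epoch_iter(data, batch_size, seed=0):
    rng = random.Random(seed)
    order = list(range(len(data)))
    rng.shuffle(order)
    for start in range(0, len(order), batch_size):
        yield [data[i] for i in order[start:start + batch_size]]

batches = list(epoch_iter(list(range(10)), batch_size=4, seed=0))
```

Note that the final batch may be smaller than `batch_size` when the data length is not a multiple of it.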

epoch_iter_with_metadata

epoch_iter_with_metadata() -> Iterator[Tuple[SamplerBatchT, Optional[Dict[str, Any]]]]

Iterate once over shuffled structured batches plus metadata.

epoch_flat_iter

epoch_flat_iter() -> Iterator[SamplerArrayType]

Iterate once over shuffled batches in flattened form.

epoch_flat_iter_with_metadata

epoch_flat_iter_with_metadata() -> Iterator[Tuple[SamplerArrayType, Optional[Dict[str, Any]]]]

Iterate once over shuffled flattened batches plus metadata.

close

close() -> None

SpaceStorage

SpaceStorage(single_instance_space: Space[BatchT, BDeviceType, BDtypeType, BRNGType])

Bases: ABC, Generic[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

SpaceStorage is an abstract base class for storages that hold instances of a specific space. It provides a common interface for creating, loading, and managing stored instances of a given space. Note that if you want a space storage to support multiprocessing, you need to check or implement the __getstate__ and __setstate__ methods so that the storage can be pickled and unpickled correctly.

Bind the storage to the space of one stored element.
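The `__getstate__`/`__setstate__` note above can be sketched generically. The class below is hypothetical (not a unienv_data storage): it drops an unpicklable handle (a lock, standing in for an open file or mmap) before pickling and recreates it after unpickling, which is the pattern multiprocessing-safe storages typically need.

```python
import pickle
import threading

# Hypothetical sketch of the pickling hooks a multiprocessing-safe
# storage needs: strip unpicklable handles, recreate them on load.
class DemoStorage:
    def __init__(self):
        self._lock = threading.Lock()  # not picklable
        self.data = [1, 2, 3]

    def __getstate__(self):
        state = self.__dict__.copy()
        del state["_lock"]             # drop the unpicklable handle
        return state

    def __setstate__(self, state):
        self.__dict__.update(state)
        self._lock = threading.Lock()  # recreate it after unpickling

clone = pickle.loads(pickle.dumps(DemoStorage()))
```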

single_file_ext class-attribute instance-attribute

single_file_ext: Optional[str] = None

capacity class-attribute instance-attribute

capacity: Optional[int] = None

The total capacity (number of single instances) of the storage. If None, the storage has unlimited capacity.

cache_filename class-attribute instance-attribute

cache_filename: Optional[Union[str, PathLike]] = None

The cache path for the storage. If None, the storage does not use caching.

is_multiprocessing_safe class-attribute instance-attribute

is_multiprocessing_safe: bool = False

Whether the storage instance can be safely used in multiprocessing environments after creation.

is_mutable class-attribute instance-attribute

is_mutable: bool = True

Whether the storage is mutable. If False, the storage is read-only.

backend property

backend: ComputeBackend[BArrayType, BDeviceType, BDtypeType, BRNGType]

device property

device: Optional[BDeviceType]

single_instance_space instance-attribute

single_instance_space = single_instance_space

create classmethod

create(single_instance_space: Space[BatchT, BDeviceType, BDtypeType, BRNGType], *args, capacity: Optional[int], cache_path: Optional[Union[str, PathLike]] = None, multiprocessing: bool = False, **kwargs) -> SpaceStorage[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a new storage instance for samples from single_instance_space.

load_from classmethod

load_from(path: Union[str, PathLike], single_instance_space: Space[BatchT, BDeviceType, BDtypeType, BRNGType], *, capacity: Optional[int] = None, read_only: bool = True, multiprocessing: bool = False, **kwargs) -> SpaceStorage[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

Restore a storage instance previously written to disk.

extend_length

extend_length(length: int) -> None

This is used by storages with capacity = None to extend the length of the storage. If called on a storage with a fixed capacity, the call is simply ignored.

shrink_length

shrink_length(length: int) -> None

This is used by storages with capacity = None to shrink the length of the storage. If called on a storage with a fixed capacity, the call is simply ignored.

get abstractmethod

get(index: Union[IndexableType, BArrayType]) -> BatchT

Read one or more samples from storage.

set abstractmethod

set(index: Union[IndexableType, BArrayType], value: BatchT) -> None

Write one or more samples into storage.

clear

clear() -> None

Clear all data inside the storage and, if the storage has unlimited capacity, set its length to 0. For storages with fixed capacity, this should reset the storage to its initial state.

dumps abstractmethod

dumps(path: Union[str, PathLike]) -> None

Dump the storage to the specified path. This is used by storages that have a single file extension (e.g. .pt for PyTorch).

close

close() -> None

ToBackendOrDeviceBatch

ToBackendOrDeviceBatch(batch: BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType], backend: Optional[ComputeBackend[WrapperBArrayT, WrapperBDeviceT, WrapperBDtypeT, WrapperBRngT]] = None, device: Optional[WrapperBDeviceT] = None)

Bases: Generic[WrapperBatchT, WrapperBArrayT, WrapperBDeviceT, WrapperBDtypeT, WrapperBRngT, BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType], BatchBase[WrapperBatchT, WrapperBArrayT, WrapperBDeviceT, WrapperBDtypeT, WrapperBRngT]

single_space instance-attribute

single_space = single_space

single_metadata_space instance-attribute

single_metadata_space = single_metadata_space

batch instance-attribute

batch = batch

target_backend instance-attribute

target_backend = backend

target_device instance-attribute

target_device = device

is_mutable property

is_mutable: bool

backend property

backend: ComputeBackend[WrapperBArrayT, WrapperBDeviceT, WrapperBDtypeT, WrapperBRngT]

device property

device: Optional[WrapperBDeviceT]

append_flattened

append_flattened(value: BArrayType) -> None

Append one flattened sample to the batch.

append

append(value: BatchT) -> None

Append one structured sample to the batch.

extend_from

extend_from(other: BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType], chunk_size: int = 8, tqdm: bool = False) -> None

Copy data from another batch in bounded-size chunks.

get_slice

get_slice(idx: Union[IndexableType, BArrayType]) -> BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a lazy view over a subset of indices.

get_column

get_column(nested_keys: Sequence[str]) -> BatchBase[Any, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a lazy view over a nested field inside each sample.

get_flattened_at

get_flattened_at(idx)

get_flattened_at_with_metadata

get_flattened_at_with_metadata(idx)

set_flattened_at

set_flattened_at(idx, value)

extend_flattened

extend_flattened(value)

get_at

get_at(idx)

get_at_with_metadata

get_at_with_metadata(idx)

set_at

set_at(idx, value)

extend

extend(value)

remove_at

remove_at(idx)

close

close()

CombinedBatch

CombinedBatch(batches: Sequence[BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]], device: Optional[BDeviceType] = None)

Bases: BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

single_space instance-attribute

single_space = single_space

single_metadata_space instance-attribute

single_metadata_space = single_metadata_space

backend property

backend: ComputeBackend[BArrayType, BDeviceType, BDtypeType, BRNGType]

device property

device: Optional[BDeviceType]

is_mutable instance-attribute

is_mutable = is_mutable

batches instance-attribute

batches = batches

append_flattened

append_flattened(value: BArrayType) -> None

Append one flattened sample to the batch.

append

append(value: BatchT) -> None

Append one structured sample to the batch.

extend_from

extend_from(other: BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType], chunk_size: int = 8, tqdm: bool = False) -> None

Copy data from another batch in bounded-size chunks.

get_slice

get_slice(idx: Union[IndexableType, BArrayType]) -> BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a lazy view over a subset of indices.

get_column

get_column(nested_keys: Sequence[str]) -> BatchBase[Any, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a lazy view over a nested field inside each sample.

get_flattened_at

get_flattened_at(idx: Union[IndexableType, BArrayType]) -> BArrayType

get_flattened_at_with_metadata

get_flattened_at_with_metadata(idx: Union[IndexableType, BArrayType]) -> Tuple[BArrayType, Dict[str, Any]]

get_at

get_at(idx: Union[IndexableType, BArrayType]) -> BatchT

get_at_with_metadata

get_at_with_metadata(idx: Union[IndexableType, BArrayType]) -> Tuple[BatchT, Dict[str, Any]]

set_flattened_at

set_flattened_at(idx: Union[IndexableType, BArrayType], value: BArrayType) -> None

extend_flattened

extend_flattened(value: BArrayType) -> None

set_at

set_at(idx: Union[IndexableType, BArrayType], value: BatchT) -> None

remove_at

remove_at(idx: Union[IndexableType, BArrayType]) -> None

extend

extend(value: BatchT) -> None

close

close() -> None

SliceStackedBatch

SliceStackedBatch(batch: BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType], fixed_offset: BArrayType, get_valid_mask_function: Optional[Callable[[SliceStackedBatch, BArrayType, BatchT, Dict[str, Any]], BArrayType]] = None, fill_invalid_data: bool = True, stack_metadata: bool = False)

Bases: BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

A batch that stacks frames at given fixed offsets. This is a read-only batch, since it is a view of the original batch. To change the data, mutate the underlying batch instead.
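The fixed-offset view can be sketched in plain Python. This is an illustration of the concept, not the library's implementation: for each requested index `i`, gather the frames at `i + offset` for every offset in `fixed_offset`, clamping at the sequence boundaries (the real class instead masks or fills invalid frames via `get_valid_mask` and `fill_invalid_data`).

```python
# Illustrative sketch of offset-based frame stacking with boundary
# clamping: each requested index expands into one row of frames.
def stacked_get(data, indices, fixed_offset):
    out = []
    for i in indices:
        row = [data[min(max(i + o, 0), len(data) - 1)] for o in fixed_offset]
        out.append(row)
    return out

frames = ["f0", "f1", "f2", "f3"]
stacked = stacked_get(frames, [0, 2], fixed_offset=[-1, 0, 1])
```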

single_space instance-attribute

single_space = single_space

single_metadata_space instance-attribute

single_metadata_space = single_metadata_space

is_mutable class-attribute instance-attribute

is_mutable = False

batch instance-attribute

batch = batch

fixed_offset instance-attribute

fixed_offset = fixed_offset

fill_invalid_data instance-attribute

fill_invalid_data = fill_invalid_data

get_valid_mask_function instance-attribute

get_valid_mask_function = get_valid_mask_function

stack_metadata instance-attribute

stack_metadata = stack_metadata

backend property

backend: ComputeBackend[BArrayType, BDeviceType, BDtypeType, BRNGType]

device property

device: Optional[BDeviceType]

set_flattened_at

set_flattened_at(idx: Union[IndexableType, BArrayType], value: BArrayType) -> None

Overwrite existing samples using flattened data.

append_flattened

append_flattened(value: BArrayType) -> None

Append one flattened sample to the batch.

extend_flattened

extend_flattened(value: BArrayType) -> None

Append a batched block of flattened samples.

set_at

set_at(idx: Union[IndexableType, BArrayType], value: BatchT) -> None

Overwrite existing samples using structured data.

remove_at

remove_at(idx: Union[IndexableType, BArrayType]) -> None

Remove one or more samples from the batch.

append

append(value: BatchT) -> None

Append one structured sample to the batch.

extend

extend(value: BatchT) -> None

Append a batched block of structured samples.

extend_from

extend_from(other: BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType], chunk_size: int = 8, tqdm: bool = False) -> None

Copy data from another batch in bounded-size chunks.

get_slice

get_slice(idx: Union[IndexableType, BArrayType]) -> BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a lazy view over a subset of indices.

get_column

get_column(nested_keys: Sequence[str]) -> BatchBase[Any, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a lazy view over a nested field inside each sample.

expand_index

expand_index(index: BArrayType) -> BArrayType

Expand indices to slice the data.

Args: index (BArrayType): A 1D tensor of indices to expand. Can be boolean or integer.

Returns: BArrayType: A 2D tensor of indices, where each row holds the expanded indices for one batch item.

get_valid_mask_flattened

get_valid_mask_flattened(expanded_idx: BArrayType, data: BArrayType, metadata: Dict[str, Any]) -> BArrayType

Get the valid mask for the flattened data.

Args: expanded_idx (B, T): A 2D tensor of indices to slice the data. data (B, T, *D): The data to slice.

Returns: Valid mask of shape (B, T).

get_valid_mask

get_valid_mask(expanded_idx: BArrayType, data: BatchT, metadata: Dict[str, Any]) -> BArrayType

Get the valid mask for the data.

Args: expanded_idx (B, T): A 2D tensor of indices to slice the data. data (BatchT): The data to slice.

Returns: Valid mask of shape (B, T).

fill_data_with_stack_mask

fill_data_with_stack_mask(space: Optional[Space[Any, BDeviceType, BDtypeType, BRNGType]], data: Union[BArrayType, BatchT, Any], valid_mask: BArrayType) -> Union[BArrayType, BatchT, Any]

Fill the data with the mask as if the frames were frame-stacked.

Args: space (Optional[Space]): The space used to fill the data. If None, the data is assumed to be a backend array. data (Union[BArrayType, BatchT, Any]): The data to fill, sized (B, T, D). valid_mask (BArrayType): The mask to fill the data with, sized (B, T).

Returns: Union[BArrayType, BatchT, Any]: The filled data, sized (B, T, D).

get_flattened_at

get_flattened_at(idx)

get_flattened_at_with_metadata

get_flattened_at_with_metadata(idx)

get_at

get_at(idx: Union[IndexableType, BArrayType]) -> BatchT

get_at_with_metadata

get_at_with_metadata(idx: Union[IndexableType, BArrayType]) -> Tuple[BatchT, Dict[str, Any]]

close

close() -> None

get_valid_mask_function_with_episodeid_key staticmethod

get_valid_mask_function_with_episodeid_key(episode_id_key: Union[str, int] = 'episode_id', is_in_metadata: bool = False) -> Callable[[SliceStackedBatch, BArrayType, BatchT, Dict[str, Any]], BArrayType]

get_valid_mask_function_with_episode_end_key staticmethod

get_valid_mask_function_with_episode_end_key(episode_end_key: Union[str, int] = 'episode_end', is_in_metadata: bool = False) -> Callable[[SliceStackedBatch, BArrayType, BatchT, Dict[str, Any]], BArrayType]
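A plausible sketch of the episode-boundary masking that `get_valid_mask_function_with_episodeid_key` suggests: an expanded frame is valid only if its episode id matches the episode id of the row's center frame, so stacked windows never leak across episode boundaries. The helper below is hypothetical and operates on plain lists rather than backend arrays.

```python
# Hedged sketch of an episode-id valid mask: frames in an expanded row
# are valid only when they share the center frame's episode id.
def episode_valid_mask(expanded_idx, episode_ids):
    mask = []
    for row in expanded_idx:
        center = episode_ids[row[len(row) // 2]]
        mask.append([episode_ids[i] == center for i in row])
    return mask

episode_ids = [0, 0, 0, 1, 1]
mask = episode_valid_mask([[1, 2, 3]], episode_ids)
```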

FrameStackedBatch

FrameStackedBatch(batch: BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType], prefetch_horizon: int = 0, postfetch_horizon: int = 0, get_valid_mask_function: Optional[Callable[[SliceStackedBatch, BArrayType, BatchT, Dict[str, Any]], BArrayType]] = None, fill_invalid_data: bool = True, stack_metadata: bool = False)

Bases: SliceStackedBatch[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

A batch that stacks frames in a sliding-window manner, allowing frames to be prefetched and postfetched around each index. This is useful for training models that require temporal context (e.g. Diffusion Policy, ACT). This is a read-only batch, since it is a view of the original batch. To change the data, mutate the underlying batch instead.
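The relationship between the horizons and the window can be sketched directly: `prefetch_horizon` past frames and `postfetch_horizon` future frames around each index, i.e. the fixed offsets `range(-prefetch_horizon, postfetch_horizon + 1)`. The clamping at boundaries below is an illustrative choice; the real class handles invalid frames via the valid mask and `fill_invalid_data`.

```python
# Illustrative sketch of the sliding window: gather frames from
# i - prefetch_horizon through i + postfetch_horizon, clamped to bounds.
def frame_stack(data, i, prefetch_horizon, postfetch_horizon):
    lo, hi = i - prefetch_horizon, i + postfetch_horizon
    return [data[min(max(j, 0), len(data) - 1)] for j in range(lo, hi + 1)]

window = frame_stack([10, 11, 12, 13], 1, prefetch_horizon=2, postfetch_horizon=1)
```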

single_space instance-attribute

single_space = single_space

single_metadata_space instance-attribute

single_metadata_space = single_metadata_space

backend property

backend: ComputeBackend[BArrayType, BDeviceType, BDtypeType, BRNGType]

device property

device: Optional[BDeviceType]

batch instance-attribute

batch = batch

fixed_offset instance-attribute

fixed_offset = fixed_offset

fill_invalid_data instance-attribute

fill_invalid_data = fill_invalid_data

get_valid_mask_function instance-attribute

get_valid_mask_function = get_valid_mask_function

stack_metadata instance-attribute

stack_metadata = stack_metadata

is_mutable class-attribute instance-attribute

is_mutable = False

get_flattened_at

get_flattened_at(idx)

get_flattened_at_with_metadata

get_flattened_at_with_metadata(idx)

set_flattened_at

set_flattened_at(idx: Union[IndexableType, BArrayType], value: BArrayType) -> None

Overwrite existing samples using flattened data.

append_flattened

append_flattened(value: BArrayType) -> None

Append one flattened sample to the batch.

extend_flattened

extend_flattened(value: BArrayType) -> None

Append a batched block of flattened samples.

get_at

get_at(idx: Union[IndexableType, BArrayType]) -> BatchT

get_at_with_metadata

get_at_with_metadata(idx: Union[IndexableType, BArrayType]) -> Tuple[BatchT, Dict[str, Any]]

set_at

set_at(idx: Union[IndexableType, BArrayType], value: BatchT) -> None

Overwrite existing samples using structured data.

remove_at

remove_at(idx: Union[IndexableType, BArrayType]) -> None

Remove one or more samples from the batch.

append

append(value: BatchT) -> None

Append one structured sample to the batch.

extend

extend(value: BatchT) -> None

Append a batched block of structured samples.

extend_from

extend_from(other: BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType], chunk_size: int = 8, tqdm: bool = False) -> None

Copy data from another batch in bounded-size chunks.

get_slice

get_slice(idx: Union[IndexableType, BArrayType]) -> BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a lazy view over a subset of indices.

get_column

get_column(nested_keys: Sequence[str]) -> BatchBase[Any, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a lazy view over a nested field inside each sample.

close

close() -> None

expand_index

expand_index(index: BArrayType) -> BArrayType

Expand indices to slice the data.

Args: index (BArrayType): A 1D tensor of indices to expand. Can be boolean or integer.

Returns: BArrayType: A 2D tensor of indices, where each row holds the expanded indices for one batch item.

get_valid_mask_flattened

get_valid_mask_flattened(expanded_idx: BArrayType, data: BArrayType, metadata: Dict[str, Any]) -> BArrayType

Get the valid mask for the flattened data.

Args: expanded_idx (B, T): A 2D tensor of indices to slice the data. data (B, T, *D): The data to slice.

Returns: Valid mask of shape (B, T).

get_valid_mask

get_valid_mask(expanded_idx: BArrayType, data: BatchT, metadata: Dict[str, Any]) -> BArrayType

Get the valid mask for the data.

Args: expanded_idx (B, T): A 2D tensor of indices to slice the data. data (BatchT): The data to slice.

Returns: Valid mask of shape (B, T).

fill_data_with_stack_mask

fill_data_with_stack_mask(space: Optional[Space[Any, BDeviceType, BDtypeType, BRNGType]], data: Union[BArrayType, BatchT, Any], valid_mask: BArrayType) -> Union[BArrayType, BatchT, Any]

Fill the data with the mask as if the frames were frame-stacked.

Args: space (Optional[Space]): The space used to fill the data. If None, the data is assumed to be a backend array. data (Union[BArrayType, BatchT, Any]): The data to fill, sized (B, T, D). valid_mask (BArrayType): The mask to fill the data with, sized (B, T).

Returns: Union[BArrayType, BatchT, Any]: The filled data, sized (B, T, D).

get_valid_mask_function_with_episodeid_key staticmethod

get_valid_mask_function_with_episodeid_key(episode_id_key: Union[str, int] = 'episode_id', is_in_metadata: bool = False) -> Callable[[SliceStackedBatch, BArrayType, BatchT, Dict[str, Any]], BArrayType]

get_valid_mask_function_with_episode_end_key staticmethod

get_valid_mask_function_with_episode_end_key(episode_end_key: Union[str, int] = 'episode_end', is_in_metadata: bool = False) -> Callable[[SliceStackedBatch, BArrayType, BatchT, Dict[str, Any]], BArrayType]

SubIndexedBatch

SubIndexedBatch(batch: BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType], sub_indexes: Union[IndexableType, BArrayType])

Bases: BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

A batch that offers a view of the original batch with a fixed index range. This does not support batch extension.
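The remapping behind this view can be sketched with a hypothetical wrapper: indices into the view pass through a fixed `sub_indexes` array before touching the source, which is why extension is not supported (the view's index set is frozen at construction).

```python
# Hypothetical sketch of the SubIndexedBatch view: every index into the
# view is remapped through sub_indexes before reaching the source.
class SubIndexedView:
    def __init__(self, source, sub_indexes):
        self.source, self.sub_indexes = source, sub_indexes

    def __len__(self):
        return len(self.sub_indexes)

    def get_at(self, idx):
        return self.source[self.sub_indexes[idx]]

view = SubIndexedView(["a", "b", "c", "d"], [3, 1])
```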

single_space instance-attribute

single_space = single_space

single_metadata_space instance-attribute

single_metadata_space = single_metadata_space

batch instance-attribute

batch = batch

sub_indexes instance-attribute

sub_indexes = sub_indexes

is_mutable property

is_mutable: bool

backend property

backend: ComputeBackend[BArrayType, BDeviceType, BDtypeType, BRNGType]

device property

device: Optional[BDeviceType]

append_flattened

append_flattened(value: BArrayType) -> None

Append one flattened sample to the batch.

remove_at

remove_at(idx: Union[IndexableType, BArrayType]) -> None

Remove one or more samples from the batch.

append

append(value: BatchT) -> None

Append one structured sample to the batch.

extend_from

extend_from(other: BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType], chunk_size: int = 8, tqdm: bool = False) -> None

Copy data from another batch in bounded-size chunks.

get_slice

get_slice(idx: Union[IndexableType, BArrayType]) -> BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a lazy view over a subset of indices.

get_column

get_column(nested_keys: Sequence[str]) -> BatchBase[Any, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a lazy view over a nested field inside each sample.

get_flattened_at

get_flattened_at(idx)

get_flattened_at_with_metadata

get_flattened_at_with_metadata(idx)

get_at

get_at(idx)

get_at_with_metadata

get_at_with_metadata(idx) -> Tuple[BatchT, Dict[str, Any]]

set_at

set_at(idx, value)

set_flattened_at

set_flattened_at(idx, value)

extend

extend(value)

extend_flattened

extend_flattened(value)

close

close() -> None

SubItemBatch

SubItemBatch(batch: BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType], sub_indexes: Sequence[Any])

Bases: BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

A batch that offers a view of the original batch with a fixed index. This is a read-only batch, since it is a view of the original batch. To change the data, mutate the underlying batch instead.

single_space instance-attribute

single_space = single_space

single_metadata_space instance-attribute

single_metadata_space = single_metadata_space

is_mutable class-attribute instance-attribute

is_mutable = False

batch instance-attribute

batch = batch

sub_indexes instance-attribute

sub_indexes = sub_indexes

backend property

backend: ComputeBackend[BArrayType, BDeviceType, BDtypeType, BRNGType]

device property

device: Optional[BDeviceType]

set_flattened_at

set_flattened_at(idx: Union[IndexableType, BArrayType], value: BArrayType) -> None

Overwrite existing samples using flattened data.

append_flattened

append_flattened(value: BArrayType) -> None

Append one flattened sample to the batch.

extend_flattened

extend_flattened(value: BArrayType) -> None

Append a batched block of flattened samples.

set_at

set_at(idx: Union[IndexableType, BArrayType], value: BatchT) -> None

Overwrite existing samples using structured data.

remove_at

remove_at(idx: Union[IndexableType, BArrayType]) -> None

Remove one or more samples from the batch.

append

append(value: BatchT) -> None

Append one structured sample to the batch.

extend

extend(value: BatchT) -> None

Append a batched block of structured samples.

extend_from

extend_from(other: BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType], chunk_size: int = 8, tqdm: bool = False) -> None

Copy data from another batch in bounded-size chunks.

get_slice

get_slice(idx: Union[IndexableType, BArrayType]) -> BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a lazy view over a subset of indices.

get_column

get_column(nested_keys: Sequence[str]) -> BatchBase[Any, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a lazy view over a nested field inside each sample.

get_flattened_at

get_flattened_at(idx)

get_flattened_at_with_metadata

get_flattened_at_with_metadata(idx)

get_at

get_at(idx)

get_at_with_metadata

get_at_with_metadata(idx) -> Tuple[BatchT, Dict[str, Any]]

close

close() -> None

TransformedBatch

TransformedBatch(batch: BatchBase[SourceDataT, SourceBArrT, SourceBDeviceT, SourceBDTypeT, SourceBDRNGT], transformation: DataTransformation, metadata_transformation: Optional[DataTransformation] = None)

Bases: BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]
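The pattern this wrapper embodies can be sketched with a hypothetical view: hold a source batch plus a transformation and apply the transformation on every read, so transformed samples are produced lazily rather than materialized up front. (The real class additionally maps spaces and supports an optional metadata transformation.)

```python
# Hedged sketch of the TransformedBatch pattern: a lazy view that
# applies a transformation to each sample as it is read.
class TransformedView:
    def __init__(self, source, transformation):
        self.source, self.transformation = source, transformation

    def get_at(self, idx):
        return self.transformation(self.source[idx])

view = TransformedView([1, 2, 3], transformation=lambda x: x * 10)
```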

single_space instance-attribute

single_space = single_space

single_metadata_space instance-attribute

single_metadata_space = single_metadata_space

batch instance-attribute

batch = batch

transformation instance-attribute

transformation = transformation

metadata_transformation instance-attribute

metadata_transformation = metadata_transformation if single_metadata_space is not None else None

backend property

backend: ComputeBackend[BArrayType, BDeviceType, BDtypeType, BRNGType]

device property

device: Optional[BDeviceType]

is_mutable property

is_mutable: bool

append_flattened

append_flattened(value: BArrayType) -> None

Append one flattened sample to the batch.

append

append(value: BatchT) -> None

Append one structured sample to the batch.

extend_from

extend_from(other: BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType], chunk_size: int = 8, tqdm: bool = False) -> None

Copy data from another batch in bounded-size chunks.

get_slice

get_slice(idx: Union[IndexableType, BArrayType]) -> BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a lazy view over a subset of indices.

get_column

get_column(nested_keys: Sequence[str]) -> BatchBase[Any, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a lazy view over a nested field inside each sample.

get_flattened_at

get_flattened_at(idx: Union[IndexableType, BArrayType]) -> BArrayType

get_flattened_at_with_metadata

get_flattened_at_with_metadata(idx: Union[IndexableType, BArrayType]) -> Tuple[BArrayType, Optional[Dict[str, Any]]]

set_flattened_at

set_flattened_at(idx: Union[IndexableType, BArrayType], value: BArrayType) -> None

extend_flattened

extend_flattened(value: BArrayType) -> None

get_at

get_at(idx: Optional[Union[IndexableType, BArrayType]] = None) -> BatchT

get_at_with_metadata

get_at_with_metadata(idx: Union[IndexableType, BArrayType]) -> Tuple[BatchT, Optional[Dict[str, Any]]]

set_at

set_at(idx: Union[IndexableType, BArrayType], value: BatchT) -> None

remove_at

remove_at(idx: Union[IndexableType, BArrayType]) -> None

extend

extend(value: BatchT) -> None

close

close()

StepSampler

StepSampler(data: BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType], batch_size: int, seed: Optional[int] = None, device: Optional[BDeviceType] = None)

Bases: BatchSampler[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType, BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

is_mutable class-attribute instance-attribute

is_mutable: bool = False

single_space instance-attribute

single_space = single_space

single_metadata_space instance-attribute

single_metadata_space = single_metadata_space

rng class-attribute instance-attribute

rng: Optional[SamplerRNGType] = None

sampled_space property

sampled_space: Space[SamplerBatchT, SamplerDeviceType, SamplerDtypeType, SamplerRNGType]

sampled_metadata_space property

sampled_metadata_space: Optional[DictSpace[SamplerDeviceType, SamplerDtypeType, SamplerRNGType]]

data instance-attribute

data = data

batch_size instance-attribute

batch_size = batch_size

data_rng instance-attribute

data_rng = random_number_generator(seed, device=device)

backend property

backend: ComputeBackend[BArrayType, BDeviceType, BDtypeType, BRNGType]

device property

device: Optional[BDeviceType]

set_flattened_at

set_flattened_at(idx: Union[IndexableType, BArrayType], value: BArrayType) -> None

Overwrite existing samples using flattened data.

append_flattened

append_flattened(value: BArrayType) -> None

Append one flattened sample to the batch.

extend_flattened

extend_flattened(value: BArrayType) -> None

Append a batched block of flattened samples.

set_at

set_at(idx: Union[IndexableType, BArrayType], value: BatchT) -> None

Overwrite existing samples using structured data.

remove_at

remove_at(idx: Union[IndexableType, BArrayType]) -> None

Remove one or more samples from the batch.

append

append(value: BatchT) -> None

Append one structured sample to the batch.

extend

extend(value: BatchT) -> None

Append a batched block of structured samples.

extend_from

extend_from(other: BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType], chunk_size: int = 8, tqdm: bool = False) -> None

Copy data from another batch in bounded-size chunks.

get_slice

get_slice(idx: Union[IndexableType, BArrayType]) -> BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a lazy view over a subset of indices.

get_column

get_column(nested_keys: Sequence[str]) -> BatchBase[Any, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a lazy view over a nested field inside each sample.

close

close() -> None

manual_seed

manual_seed(seed: int) -> None

Reset sampler RNG state, including the optional data-index RNG.

sample_index

sample_index() -> SamplerArrayType

Draw random indices for one batch.

sample_flat

sample_flat() -> SamplerArrayType

Sample a batch and return it in flattened form.

sample_flat_with_metadata

sample_flat_with_metadata() -> Tuple[SamplerArrayType, Optional[Dict[str, Any]]]

Sample a flattened batch together with optional metadata.

sample

sample() -> SamplerBatchT

Sample a batch in its structured form.

sample_with_metadata

sample_with_metadata() -> Tuple[SamplerBatchT, Optional[Dict[str, Any]]]

Sample a structured batch together with optional metadata.

epoch_iter

epoch_iter() -> Iterator[SamplerBatchT]

Iterate once over a random permutation of the source data.
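The epoch iteration described above can be sketched as a shuffled index permutation consumed in batch_size steps. This is an illustrative stand-in using stdlib `random`, not the sampler's actual RNG machinery:

```python
import random

def epoch_iter_sketch(n_samples, batch_size, seed=None):
    """Yield index batches that together cover a random permutation
    of range(n_samples) exactly once; the last batch may be short."""
    rng = random.Random(seed)
    perm = list(range(n_samples))
    rng.shuffle(perm)
    for start in range(0, n_samples, batch_size):
        yield perm[start:start + batch_size]
```

Each sample appears exactly once per epoch, unlike sample(), which draws indices independently and may repeat samples within an epoch.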

epoch_iter_with_metadata

epoch_iter_with_metadata() -> Iterator[Tuple[SamplerBatchT, Optional[Dict[str, Any]]]]

Iterate once over shuffled structured batches plus metadata.

epoch_flat_iter

epoch_flat_iter() -> Iterator[SamplerArrayType]

Iterate once over shuffled batches in flattened form.

epoch_flat_iter_with_metadata

epoch_flat_iter_with_metadata() -> Iterator[Tuple[SamplerArrayType, Optional[Dict[str, Any]]]]

Iterate once over shuffled flattened batches plus metadata.

get_flattened_at

get_flattened_at(idx)

get_flattened_at_with_metadata

get_flattened_at_with_metadata(idx)

get_at

get_at(idx)

get_at_with_metadata

get_at_with_metadata(idx)

MultiprocessingSampler

MultiprocessingSampler(sampler: BatchSampler[SamplerBatchT, SamplerArrayType, SamplerDeviceType, SamplerDtypeType, SamplerRNGType, BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType], n_workers: int = 4, n_buffers: int = 8, ctx: Optional[BaseContext] = None, doze_time: float = 0.005, daemon: Optional[bool] = None)

Bases: BatchSampler[SamplerBatchT, SamplerArrayType, SamplerDeviceType, SamplerDtypeType, SamplerRNGType, BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]
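MultiprocessingSampler wraps a sampler so that batches are produced in the background with a bounded number of in-flight buffers. A thread-based toy stand-in for that prefetch pattern (the real class uses worker processes and shared buffers; `prefetching_sampler` and its parameters here are illustrative only):

```python
import queue
import threading

def prefetching_sampler(sample_fn, n_batches, n_buffers=8):
    """Toy prefetch loop: one background worker fills a bounded queue
    (at most n_buffers pending batches) while the consumer drains it."""
    buf = queue.Queue(maxsize=n_buffers)

    def worker():
        for _ in range(n_batches):
            buf.put(sample_fn())  # blocks once n_buffers batches are pending

    t = threading.Thread(target=worker, daemon=True)
    t.start()
    for _ in range(n_batches):
        yield buf.get()
    t.join()
```

The bounded queue is what keeps memory in check: sampling can run ahead of the consumer, but only by n_buffers batches.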

is_mutable class-attribute instance-attribute

is_mutable: bool = False

single_space instance-attribute

single_space = single_space

single_metadata_space instance-attribute

single_metadata_space = single_metadata_space

batch_size instance-attribute

batch_size = batch_size

mp_ctx instance-attribute

mp_ctx = ctx

sampler instance-attribute

sampler = sampler

closed instance-attribute

closed = False

n_workers instance-attribute

n_workers = n_workers

n_buffers instance-attribute

n_buffers = n_buffers

functions_to_close instance-attribute

functions_to_close = []

sampled_space property

sampled_space

sampled_metadata_space property

sampled_metadata_space

backend property

backend

device property

device

data property

data

rng property writable

rng

data_rng property writable

data_rng

set_flattened_at

set_flattened_at(idx: Union[IndexableType, BArrayType], value: BArrayType) -> None

Overwrite existing samples using flattened data.

append_flattened

append_flattened(value: BArrayType) -> None

Append one flattened sample to the batch.

extend_flattened

extend_flattened(value: BArrayType) -> None

Append a batched block of flattened samples.

set_at

set_at(idx: Union[IndexableType, BArrayType], value: BatchT) -> None

Overwrite existing samples using structured data.

remove_at

remove_at(idx: Union[IndexableType, BArrayType]) -> None

Remove one or more samples from the batch.

append

append(value: BatchT) -> None

Append one structured sample to the batch.

extend

extend(value: BatchT) -> None

Append a batched block of structured samples.

extend_from

extend_from(other: BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType], chunk_size: int = 8, tqdm: bool = False) -> None

Copy data from another batch in bounded-size chunks.

get_slice

get_slice(idx: Union[IndexableType, BArrayType]) -> BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a lazy view over a subset of indices.

get_column

get_column(nested_keys: Sequence[str]) -> BatchBase[Any, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a lazy view over a nested field inside each sample.

sample_index

sample_index() -> SamplerArrayType

Draw random indices for one batch.

sample_flat

sample_flat() -> SamplerArrayType

Sample a batch and return it in flattened form.

sample_flat_with_metadata

sample_flat_with_metadata() -> Tuple[SamplerArrayType, Optional[Dict[str, Any]]]

Sample a flattened batch together with optional metadata.

sample

sample() -> SamplerBatchT

Sample a batch in its structured form.

sample_with_metadata

sample_with_metadata() -> Tuple[SamplerBatchT, Optional[Dict[str, Any]]]

Sample a structured batch together with optional metadata.

epoch_iter

epoch_iter() -> Iterator[SamplerBatchT]

Iterate once over a random permutation of the source data.

epoch_iter_with_metadata

epoch_iter_with_metadata() -> Iterator[Tuple[SamplerBatchT, Optional[Dict[str, Any]]]]

Iterate once over shuffled structured batches plus metadata.

epoch_flat_iter

epoch_flat_iter() -> Iterator[SamplerArrayType]

Iterate once over shuffled batches in flattened form.

epoch_flat_iter_with_metadata

epoch_flat_iter_with_metadata() -> Iterator[Tuple[SamplerArrayType, Optional[Dict[str, Any]]]]

Iterate once over shuffled flattened batches plus metadata.

manual_seed

manual_seed(seed)

get_at

get_at(idx)

get_at_with_metadata

get_at_with_metadata(idx)

get_flattened_at

get_flattened_at(idx)

get_flattened_at_with_metadata

get_flattened_at_with_metadata(idx)

close

close()

ReplayBuffer

ReplayBuffer(storage: SpaceStorage[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType], storage_path_relative: str, count: int = 0, offset: int = 0, cache_path: Optional[Union[str, PathLike]] = None, multiprocessing: bool = False)

Bases: BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

Ring-buffer style batch storage built on top of a SpaceStorage.

The buffer exposes the standard BatchBase interface while handling round-robin overwrite semantics, persistence metadata, and optional multiprocessing-safe counters.

Wrap a storage object with replay-buffer indexing semantics.
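The round-robin overwrite semantics can be illustrated with a minimal ring buffer over a Python list. This is a toy sketch of the count/offset/capacity bookkeeping only, not the SpaceStorage-backed implementation:

```python
class RingBufferSketch:
    """Toy count/offset round-robin indexing. Once count reaches
    capacity, appends overwrite the oldest slot and offset advances,
    so logical index 0 always refers to the oldest surviving sample."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.slots = [None] * capacity  # stand-in for the backing storage
        self.count = 0                  # number of valid samples
        self.offset = 0                 # physical index of the oldest sample

    def append(self, value):
        if self.count < self.capacity:
            self.slots[(self.offset + self.count) % self.capacity] = value
            self.count += 1
        else:
            self.slots[self.offset] = value          # overwrite oldest
            self.offset = (self.offset + 1) % self.capacity

    def get_at(self, idx):
        # Translate a logical index (0 = oldest) to a physical slot.
        return self.slots[(self.offset + idx) % self.capacity]
```

Persisting count and offset alongside the storage is what lets a buffer be reloaded mid-rotation without losing track of which sample is oldest.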

single_space instance-attribute

single_space = single_space

single_metadata_space instance-attribute

single_metadata_space = single_metadata_space

storage instance-attribute

storage = storage

cache_path property

cache_path: Optional[Union[str, PathLike]]

storage_path_relative property

storage_path_relative: str

count property writable

count: int

offset property writable

offset: int

capacity property

capacity: Optional[int]

backend property

backend: ComputeBackend[BArrayType, BDeviceType, BDtypeType, BRNGType]

device property

device: Optional[BDeviceType]

is_mutable property

is_mutable: bool

is_multiprocessing_safe property

is_multiprocessing_safe: bool

append_flattened

append_flattened(value: BArrayType) -> None

Append one flattened sample to the batch.

remove_at

remove_at(idx: Union[IndexableType, BArrayType]) -> None

Remove one or more samples from the batch.

append

append(value: BatchT) -> None

Append one structured sample to the batch.

extend_from

extend_from(other: BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType], chunk_size: int = 8, tqdm: bool = False) -> None

Copy data from another batch in bounded-size chunks.

get_slice

get_slice(idx: Union[IndexableType, BArrayType]) -> BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a lazy view over a subset of indices.

create staticmethod

create(storage_cls: Type[SpaceStorage[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]], single_instance_space: Space[BatchT, BDeviceType, BDtypeType, BRNGType], *args, cache_path: Optional[Union[str, PathLike]] = None, capacity: Optional[int] = None, multiprocessing: bool = False, **kwargs) -> ReplayBuffer[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a new replay buffer and its backing storage.

is_loadable_from staticmethod

is_loadable_from(path: Union[str, PathLike]) -> bool

get_length_from_path staticmethod

get_length_from_path(path: Union[str, PathLike]) -> Optional[int]

get_capacity_from_path staticmethod

get_capacity_from_path(path: Union[str, PathLike]) -> Optional[int]

get_space_from_path staticmethod

get_space_from_path(path: Union[str, PathLike], *, backend: ComputeBackend[BArrayType, BDeviceType, BDtypeType, BRNGType], device: Optional[BDeviceType] = None) -> Optional[Space[BatchT, BDeviceType, BDtypeType, BRNGType]]

load_from staticmethod

load_from(path: Union[str, PathLike], *, backend: ComputeBackend[BArrayType, BDeviceType, BDtypeType, BRNGType], device: Optional[BDeviceType] = None, read_only: bool = True, multiprocessing: bool = False, **storage_kwargs) -> ReplayBuffer[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

Load a replay buffer plus its backing storage from disk.

dumps

dumps(path: Union[str, PathLike])

Persist the replay buffer metadata and its backing storage.

get_flattened_at

get_flattened_at(idx)

get_flattened_at_with_metadata

get_flattened_at_with_metadata(idx: Union[IndexableType, BArrayType]) -> Tuple[BArrayType, Optional[Dict[str, Any]]]

get_at

get_at(idx)

get_at_with_metadata

get_at_with_metadata(idx)

set_flattened_at

set_flattened_at(idx: Union[IndexableType, BArrayType], value: BArrayType) -> None

set_at

set_at(idx, value)

extend_flattened

extend_flattened(value: BArrayType)

extend

extend(value)

get_column

get_column(nested_keys)

clear

clear()

close

close() -> None

TrajectoryReplayBuffer

TrajectoryReplayBuffer(step_data_buffer: ReplayBuffer[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType], step_episode_id_buffer: ReplayBuffer[BArrayType, BArrayType, BDeviceType, BDtypeType, BRNGType], episode_data_buffer: Optional[ReplayBuffer[EpisodeBatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]], current_episode_id: int = 0, episode_id_to_index_map: Optional[Dict[int, int]] = None, multiprocessing: bool = False)

Bases: BatchBase[TrajectoryData[BatchT, EpisodeBatchT], BArrayType, BDeviceType, BDtypeType, BRNGType], Generic[BatchT, EpisodeBatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

Replay buffer that tracks both step-level and episode-level data.

Each transition stores step data plus an episode id, and can optionally reference separate episode metadata for trajectory-aware sampling.

Assemble the step buffer, episode id buffer, and optional episode buffer.
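The step/episode bookkeeping can be sketched with plain lists standing in for the three subordinate buffers. Names mirror the attributes documented below, but this is an illustrative toy, not the ReplayBuffer-backed implementation:

```python
class TrajectorySketch:
    """Toy bookkeeping: every appended step records the current episode
    id in a parallel structure, and episode-level data is looked up via
    episode_id_to_index_map."""

    def __init__(self):
        self.steps = []               # stand-in for step_data_buffer
        self.step_episode_ids = []    # stand-in for step_episode_id_buffer
        self.episode_data = []        # stand-in for episode_data_buffer
        self.episode_id_to_index_map = {}
        self.current_episode_id = 0

    def append(self, step):
        self.steps.append(step)
        self.step_episode_ids.append(self.current_episode_id)

    def set_current_episode_data(self, value):
        eid = self.current_episode_id
        if eid not in self.episode_id_to_index_map:
            self.episode_id_to_index_map[eid] = len(self.episode_data)
            self.episode_data.append(value)
        else:
            self.episode_data[self.episode_id_to_index_map[eid]] = value

    def mark_episode_end(self):
        self.current_episode_id += 1
```

Keeping episode ids per step (rather than episode boundaries) is what survives ring-buffer overwrites: a partially overwritten episode still maps each remaining step to its episode data.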

single_space instance-attribute

single_space = single_space

single_metadata_space instance-attribute

single_metadata_space = single_metadata_space

step_data_buffer instance-attribute

step_data_buffer = step_data_buffer

step_episode_id_buffer instance-attribute

step_episode_id_buffer = step_episode_id_buffer

episode_data_buffer instance-attribute

episode_data_buffer = episode_data_buffer

episode_id_to_index_map instance-attribute

episode_id_to_index_map = dict(episode_id_to_index_map) if episode_id_to_index_map is not None else None

capacity property

capacity: Optional[int]

episode_capacity property

episode_capacity: Optional[int]

backend property

backend: ComputeBackend[BArrayType, BDeviceType, BDtypeType, BRNGType]

device property

device: Optional[BDeviceType]

is_mutable property

is_mutable: bool

is_multiprocessing_safe property

is_multiprocessing_safe: bool

current_episode_id property writable

current_episode_id: int

append_flattened

append_flattened(value: BArrayType) -> None

Append one flattened sample to the batch.

remove_at

remove_at(idx: Union[IndexableType, BArrayType]) -> None

Remove one or more samples from the batch.

append

append(value: BatchT) -> None

Append one structured sample to the batch.

extend_from

extend_from(other: BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType], chunk_size: int = 8, tqdm: bool = False) -> None

Copy data from another batch in bounded-size chunks.

get_slice

get_slice(idx: Union[IndexableType, BArrayType]) -> BatchBase[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a lazy view over a subset of indices.

get_column

get_column(nested_keys: Sequence[str]) -> BatchBase[Any, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a lazy view over a nested field inside each sample.

create staticmethod

create(step_data_instance_space: Space[BatchT, BDeviceType, BDtypeType, BRNGType], step_data_storage_cls: Type[SpaceStorage[BatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]], *, step_data_capacity: Optional[int] = None, step_episode_id_storage_cls: Optional[Type[SpaceStorage[BArrayType, BArrayType, BDeviceType, BDtypeType, BRNGType]]] = None, step_episode_id_capacity: Optional[int] = None, step_episode_id_storage_kwargs: Dict[str, Any] = {}, episode_data_instance_space: Optional[Space[EpisodeBatchT, BDeviceType, BDtypeType, BRNGType]] = None, episode_data_storage_cls: Optional[Type[SpaceStorage[EpisodeBatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]]] = None, episode_data_capacity: Optional[int] = None, episode_data_storage_kwargs: Dict[str, Any] = {}, cache_path: Optional[Union[str, PathLike]] = None, multiprocessing: bool = False, **kwargs) -> TrajectoryReplayBuffer[BatchT, EpisodeBatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

Create a new trajectory replay buffer and all subordinate buffers.

is_loadable_from staticmethod

is_loadable_from(path: Union[str, PathLike]) -> bool

load_from staticmethod

load_from(path: Union[str, PathLike], *, backend: ComputeBackend[BArrayType, BDeviceType, BDtypeType, BRNGType], device: Optional[BDeviceType] = None, read_only: bool = False, multiprocessing: bool = False, step_storage_kwargs: Dict[str, Any] = {}, step_episode_id_storage_kwargs: Dict[str, Any] = {}, episode_storage_kwargs: Dict[str, Any] = {}, **storage_kwargs) -> TrajectoryReplayBuffer[BatchT, EpisodeBatchT, BArrayType, BDeviceType, BDtypeType, BRNGType]

Load a trajectory replay buffer and its subordinate buffers from disk.

dumps

dumps(path: Union[str, PathLike])

Persist buffer metadata plus all subordinate replay buffers.

episode_id_to_episode_data_index

episode_id_to_episode_data_index(episode_id: Union[int, BArrayType]) -> Union[int, BArrayType]

get_flattened_at

get_flattened_at(idx)

get_flattened_at_with_metadata

get_flattened_at_with_metadata(idx)

get_at

get_at(idx)

get_at_with_metadata

get_at_with_metadata(idx)

set_flattened_at

set_flattened_at(idx, value)

Set the flattened data at the specified index.

set_at

set_at(idx, value)

set_episode_data_at

set_episode_data_at(episode_id: Union[int, BArrayType], value: Any) -> None

extend_flattened

extend_flattened(value)

extend

extend(value)

set_current_episode_data

set_current_episode_data(value: EpisodeBatchT) -> None

mark_episode_end

mark_episode_end() -> None

clear

clear()

close

close()