Tensor

batchtensor.tensor

Contains functions to manipulate tensors.

batchtensor.tensor.amax_along_batch

amax_along_batch(
    tensor: Tensor, keepdim: bool = False
) -> Tensor

Return the maximum of all elements along the batch dimension.

Note

This function assumes the batch dimension is the first dimension.

Parameters:

    tensor (Tensor): The input tensor.
    keepdim (bool, optional): Whether the output tensor has dim retained or not. Default: False

Returns:

    Tensor: The maximum of all elements along the batch dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import amax_along_batch
>>> tensor = torch.tensor([[0, 1], [2, 3], [4, 5], [6, 7], [8, 9]])
>>> out = amax_along_batch(tensor)
>>> out
tensor([8, 9])
>>> out = amax_along_batch(tensor, keepdim=True)
>>> out
tensor([[8, 9]])
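
Since only the first dimension is reduced, the same call also works on higher-dimensional inputs such as a (batch, sequence, feature) tensor; the remaining dimensions are preserved. A small sketch of this assumption:

>>> import torch
>>> from batchtensor.tensor import amax_along_batch
>>> tensor = torch.arange(12).reshape(3, 2, 2)
>>> amax_along_batch(tensor)
tensor([[ 8,  9],
        [10, 11]])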

batchtensor.tensor.amax_along_seq

amax_along_seq(
    tensor: Tensor, keepdim: bool = False
) -> Tensor

Return the maximum of all elements along the sequence dimension.

Note

This function assumes the sequence dimension is the second dimension.

Parameters:

    tensor (Tensor): The input tensor.
    keepdim (bool, optional): Whether the output tensor has dim retained or not. Default: False

Returns:

    Tensor: The maximum of all elements along the sequence dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import amax_along_seq
>>> tensor = torch.tensor([[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]])
>>> out = amax_along_seq(tensor)
>>> out
tensor([4, 9])
>>> out = amax_along_seq(tensor, keepdim=True)
>>> out
tensor([[4], [9]])

batchtensor.tensor.amin_along_batch

amin_along_batch(
    tensor: Tensor, keepdim: bool = False
) -> Tensor

Return the minimum of all elements along the batch dimension.

Note

This function assumes the batch dimension is the first dimension.

Parameters:

    tensor (Tensor): The input tensor.
    keepdim (bool, optional): Whether the output tensor has dim retained or not. Default: False

Returns:

    Tensor: The minimum of all elements along the batch dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import amin_along_batch
>>> tensor = torch.tensor([[0, 1], [2, 3], [4, 5], [6, 7], [8, 9]])
>>> out = amin_along_batch(tensor)
>>> out
tensor([0, 1])
>>> out = amin_along_batch(tensor, keepdim=True)
>>> out
tensor([[0, 1]])

batchtensor.tensor.amin_along_seq

amin_along_seq(
    tensor: Tensor, keepdim: bool = False
) -> Tensor

Return the minimum of all elements along the sequence dimension.

Note

This function assumes the sequence dimension is the second dimension.

Parameters:

    tensor (Tensor): The input tensor.
    keepdim (bool, optional): Whether the output tensor has dim retained or not. Default: False

Returns:

    Tensor: The minimum of all elements along the sequence dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import amin_along_seq
>>> tensor = torch.tensor([[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]])
>>> out = amin_along_seq(tensor)
>>> out
tensor([0, 5])
>>> out = amin_along_seq(tensor, keepdim=True)
>>> out
tensor([[0], [5]])

batchtensor.tensor.argmax_along_batch

argmax_along_batch(
    tensor: Tensor, keepdim: bool = False
) -> Tensor

Return the indices of the maximum value of all elements along the batch dimension.

Note

This function assumes the batch dimension is the first dimension.

Parameters:

    tensor (Tensor): The input tensor.
    keepdim (bool, optional): Whether the output tensor has dim retained or not. Default: False

Returns:

    Tensor: The indices of the maximum value of all elements along the batch dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import argmax_along_batch
>>> tensor = torch.tensor([[0, 1], [2, 3], [4, 5], [6, 7], [8, 9]])
>>> out = argmax_along_batch(tensor)
>>> out
tensor([4, 4])
>>> out = argmax_along_batch(tensor, keepdim=True)
>>> out
tensor([[4, 4]])

batchtensor.tensor.argmax_along_seq

argmax_along_seq(
    tensor: Tensor, keepdim: bool = False
) -> Tensor

Return the indices of the maximum value of all elements along the sequence dimension.

Note

This function assumes the sequence dimension is the second dimension.

Parameters:

    tensor (Tensor): The input tensor.
    keepdim (bool, optional): Whether the output tensor has dim retained or not. Default: False

Returns:

    Tensor: The indices of the maximum value of all elements along the sequence dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import argmax_along_seq
>>> tensor = torch.tensor([[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]])
>>> out = argmax_along_seq(tensor)
>>> out
tensor([4, 4])
>>> out = argmax_along_seq(tensor, keepdim=True)
>>> out
tensor([[4], [4]])

batchtensor.tensor.argmin_along_batch

argmin_along_batch(
    tensor: Tensor, keepdim: bool = False
) -> Tensor

Return the indices of the minimum value of all elements along the batch dimension.

Note

This function assumes the batch dimension is the first dimension.

Parameters:

    tensor (Tensor): The input tensor.
    keepdim (bool, optional): Whether the output tensor has dim retained or not. Default: False

Returns:

    Tensor: The indices of the minimum value of all elements along the batch dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import argmin_along_batch
>>> tensor = torch.tensor([[0, 1], [2, 3], [4, 5], [6, 7], [8, 9]])
>>> out = argmin_along_batch(tensor)
>>> out
tensor([0, 0])
>>> out = argmin_along_batch(tensor, keepdim=True)
>>> out
tensor([[0, 0]])

batchtensor.tensor.argmin_along_seq

argmin_along_seq(
    tensor: Tensor, keepdim: bool = False
) -> Tensor

Return the indices of the minimum value of all elements along the sequence dimension.

Note

This function assumes the sequence dimension is the second dimension.

Parameters:

    tensor (Tensor): The input tensor.
    keepdim (bool, optional): Whether the output tensor has dim retained or not. Default: False

Returns:

    Tensor: The indices of the minimum value of all elements along the sequence dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import argmin_along_seq
>>> tensor = torch.tensor([[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]])
>>> out = argmin_along_seq(tensor)
>>> out
tensor([0, 0])
>>> out = argmin_along_seq(tensor, keepdim=True)
>>> out
tensor([[0], [0]])

batchtensor.tensor.argsort_along_batch

argsort_along_batch(
    tensor: Tensor, descending: bool = False, **kwargs: Any
) -> Tensor

Return the indices that sort a tensor along the batch dimension in ascending order by value.

Note

This function assumes the batch dimension is the first dimension.

Parameters:

    tensor (Tensor): The input tensor.
    descending (bool, optional): Controls the sorting order (ascending or descending). Default: False
    **kwargs (Any): Additional keyword arguments for torch.argsort.

Returns:

    Tensor: The indices that sort the tensor along the batch dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import argsort_along_batch
>>> tensor = torch.tensor([[2, 6], [0, 3], [4, 9], [8, 1], [5, 7]])
>>> out = argsort_along_batch(tensor)
>>> out
tensor([[1, 3], [0, 1], [2, 0], [4, 4], [3, 2]])
>>> out = argsort_along_batch(tensor, descending=True)
>>> out
tensor([[3, 2], [4, 4], [2, 0], [0, 1], [1, 3]])
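
Because the indices are computed independently for each column, they can be fed to torch.gather (standard PyTorch, not part of batchtensor) to materialize the sorted values:

>>> indices = argsort_along_batch(tensor)
>>> torch.gather(tensor, dim=0, index=indices)
tensor([[0, 1],
        [2, 3],
        [4, 6],
        [5, 7],
        [8, 9]])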

batchtensor.tensor.argsort_along_seq

argsort_along_seq(
    tensor: Tensor, descending: bool = False, **kwargs: Any
) -> Tensor

Return the indices that sort a tensor along the sequence dimension in ascending order by value.

Note

This function assumes the sequence dimension is the second dimension.

Parameters:

    tensor (Tensor): The input tensor.
    descending (bool, optional): Controls the sorting order (ascending or descending). Default: False
    **kwargs (Any): Additional keyword arguments for torch.argsort.

Returns:

    Tensor: The indices that sort the tensor along the sequence dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import argsort_along_seq
>>> tensor = torch.tensor([[7, 3, 0, 8, 5], [1, 9, 6, 4, 2]])
>>> out = argsort_along_seq(tensor)
>>> out
tensor([[2, 1, 4, 0, 3],
        [0, 4, 3, 2, 1]])
>>> out = argsort_along_seq(tensor, descending=True)
>>> out
tensor([[3, 0, 4, 1, 2],
        [1, 2, 3, 4, 0]])

batchtensor.tensor.cat_along_batch

cat_along_batch(
    tensors: list[Tensor] | tuple[Tensor, ...],
) -> Tensor

Concatenate the given tensors in the batch dimension.

All tensors must either have the same data type and shape (except in the concatenating dimension) or be empty.

Note

This function assumes the batch dimension is the first dimension.

Parameters:

    tensors (list[Tensor] | tuple[Tensor, ...]): The tensors to concatenate.

Returns:

    Tensor: The concatenated tensors along the batch dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import cat_along_batch
>>> tensors = [
...     torch.tensor([[0, 1, 2], [4, 5, 6]]),
...     torch.tensor([[10, 11, 12], [13, 14, 15]]),
... ]
>>> out = cat_along_batch(tensors)
>>> out
tensor([[ 0,  1,  2],
        [ 4,  5,  6],
        [10, 11, 12],
        [13, 14, 15]])

batchtensor.tensor.cat_along_seq

cat_along_seq(
    tensors: list[Tensor] | tuple[Tensor, ...],
) -> Tensor

Concatenate the given tensors in the sequence dimension.

All tensors must either have the same data type and shape (except in the concatenating dimension) or be empty.

Note

This function assumes the sequence dimension is the second dimension.

Parameters:

    tensors (list[Tensor] | tuple[Tensor, ...]): The tensors to concatenate.

Returns:

    Tensor: The concatenated tensors along the sequence dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import cat_along_seq
>>> tensors = [
...     torch.tensor([[0, 1, 2], [4, 5, 6]]),
...     torch.tensor([[10, 11], [12, 13]]),
... ]
>>> out = cat_along_seq(tensors)
>>> out
tensor([[ 0,  1,  2, 10, 11],
        [ 4,  5,  6, 12, 13]])
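
For the typical (batch, sequence, feature) layout, the same call joins sequences while leaving the batch and feature dimensions untouched. A minimal sketch assuming 3-D inputs:

>>> out = cat_along_seq([torch.zeros(2, 3, 4), torch.ones(2, 2, 4)])
>>> out.shape
torch.Size([2, 5, 4])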

batchtensor.tensor.chunk_along_batch

chunk_along_batch(
    tensor: Tensor, chunks: int
) -> tuple[Tensor, ...]

Split the tensor into chunks along the batch dimension.

Each chunk is a view of the input tensor.

Note

This function assumes the batch dimension is the first dimension.

Parameters:

    tensor (Tensor): The tensor to split.
    chunks (int): Number of chunks to return.

Returns:

    tuple[Tensor, ...]: The tensor chunks.

Example usage:

>>> import torch
>>> from batchtensor.tensor import chunk_along_batch
>>> tensor = torch.tensor([[0, 1], [2, 3], [4, 5], [6, 7], [8, 9]])
>>> outputs = chunk_along_batch(tensor, chunks=3)
>>> outputs
(tensor([[0, 1], [2, 3]]),
 tensor([[4, 5], [6, 7]]),
 tensor([[8, 9]]))

batchtensor.tensor.chunk_along_seq

chunk_along_seq(
    tensor: Tensor, chunks: int
) -> tuple[Tensor, ...]

Split the tensor into chunks along the sequence dimension.

Each chunk is a view of the input tensor.

Note

This function assumes the sequence dimension is the second dimension.

Parameters:

    tensor (Tensor): The tensor to split.
    chunks (int): Number of chunks to return.

Returns:

    tuple[Tensor, ...]: The tensor chunks.

Example usage:

>>> import torch
>>> from batchtensor.tensor import chunk_along_seq
>>> tensor = torch.tensor([[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]])
>>> outputs = chunk_along_seq(tensor, chunks=3)
>>> outputs
(tensor([[0, 1], [5, 6]]),
 tensor([[2, 3], [7, 8]]),
 tensor([[4], [9]]))

batchtensor.tensor.cumprod_along_batch

cumprod_along_batch(tensor: Tensor) -> Tensor

Return the cumulative product of elements of input in the batch dimension.

Note

This function assumes the batch dimension is the first dimension.

Parameters:

    tensor (Tensor): The input tensor.

Returns:

    Tensor: The cumulative product of elements of input in the batch dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import cumprod_along_batch
>>> tensor = torch.tensor([[1, 2], [3, 4], [5, 6], [7, 8], [9, 10]])
>>> out = cumprod_along_batch(tensor)
>>> out
tensor([[   1,    2], [   3,    8], [  15,   48], [ 105,  384], [ 945, 3840]])

batchtensor.tensor.cumprod_along_seq

cumprod_along_seq(tensor: Tensor) -> Tensor

Return the cumulative product of elements of input in the sequence dimension.

Note

This function assumes the sequence dimension is the second dimension.

Parameters:

    tensor (Tensor): The input tensor.

Returns:

    Tensor: The cumulative product of elements of input in the sequence dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import cumprod_along_seq
>>> tensor = torch.tensor([[1, 2, 3, 4, 5], [6, 7, 8, 9, 10]])
>>> out = cumprod_along_seq(tensor)
>>> out
tensor([[    1,     2,     6,    24,   120],
        [    6,    42,   336,  3024, 30240]])

batchtensor.tensor.cumsum_along_batch

cumsum_along_batch(tensor: Tensor) -> Tensor

Return the cumulative sum of elements of input in the batch dimension.

Note

This function assumes the batch dimension is the first dimension.

Parameters:

    tensor (Tensor): The input tensor.

Returns:

    Tensor: The cumulative sum of elements of input in the batch dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import cumsum_along_batch
>>> tensor = torch.tensor([[0, 1], [2, 3], [4, 5], [6, 7], [8, 9]])
>>> out = cumsum_along_batch(tensor)
>>> out
tensor([[ 0,  1], [ 2,  4], [ 6,  9], [12, 16], [20, 25]])

batchtensor.tensor.cumsum_along_seq

cumsum_along_seq(tensor: Tensor) -> Tensor

Return the cumulative sum of elements of input in the sequence dimension.

Note

This function assumes the sequence dimension is the second dimension.

Parameters:

    tensor (Tensor): The input tensor.

Returns:

    Tensor: The cumulative sum of elements of input in the sequence dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import cumsum_along_seq
>>> tensor = torch.tensor([[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]])
>>> out = cumsum_along_seq(tensor)
>>> out
tensor([[ 0,  1,  3,  6, 10],
        [ 5, 11, 18, 26, 35]])

batchtensor.tensor.index_select_along_batch

index_select_along_batch(
    tensor: Tensor, index: Tensor
) -> Tensor

Return a new tensor that indexes the input tensor along the batch dimension using the entries in index, which is a LongTensor.

Note

This function assumes the batch dimension is the first dimension.

Parameters:

    tensor (Tensor): The input tensor.
    index (Tensor): The 1-D tensor containing the indices to index.

Returns:

    Tensor: The indexed tensor along the batch dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import index_select_along_batch
>>> tensor = torch.tensor([[0, 1], [2, 3], [4, 5], [6, 7], [8, 9]])
>>> out = index_select_along_batch(tensor, torch.tensor([2, 4]))
>>> out
tensor([[4, 5],
        [8, 9]])
>>> out = index_select_along_batch(tensor, torch.tensor([4, 3, 2, 1, 0]))
>>> out
tensor([[8, 9],
        [6, 7],
        [4, 5],
        [2, 3],
        [0, 1]])

batchtensor.tensor.index_select_along_seq

index_select_along_seq(
    tensor: Tensor, index: Tensor
) -> Tensor

Return a new tensor that indexes the input tensor along the sequence dimension using the entries in index, which is a LongTensor.

Note

This function assumes the sequence dimension is the second dimension.

Parameters:

    tensor (Tensor): The input tensor.
    index (Tensor): The 1-D tensor containing the indices to index.

Returns:

    Tensor: The indexed tensor along the sequence dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import index_select_along_seq
>>> tensor = torch.tensor([[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]])
>>> out = index_select_along_seq(tensor, torch.tensor([2, 4]))
>>> out
tensor([[2, 4],
        [7, 9]])
>>> out = index_select_along_seq(tensor, torch.tensor([4, 3, 2, 1, 0]))
>>> out
tensor([[4, 3, 2, 1, 0],
        [9, 8, 7, 6, 5]])

batchtensor.tensor.max_along_batch

max_along_batch(
    tensor: Tensor, keepdim: bool = False
) -> torch.return_types.max

Return the maximum of all elements along the batch dimension.

Note

This function assumes the batch dimension is the first dimension.

Parameters:

    tensor (Tensor): The input tensor.
    keepdim (bool, optional): Whether the output tensor has dim retained or not. Default: False

Returns:

    torch.return_types.max: A namedtuple of (values, indices), where values contains the maximum values and indices contains their indices along the batch dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import max_along_batch
>>> tensor = torch.tensor([[0, 1], [2, 3], [4, 5], [6, 7], [8, 9]])
>>> out = max_along_batch(tensor)
>>> out
torch.return_types.max(
values=tensor([8, 9]),
indices=tensor([4, 4]))
>>> out = max_along_batch(tensor, keepdim=True)
>>> out
torch.return_types.max(
values=tensor([[8, 9]]),
indices=tensor([[4, 4]]))

batchtensor.tensor.max_along_seq

max_along_seq(tensor: Tensor, keepdim: bool = False) -> torch.return_types.max

Return the maximum of all elements along the sequence dimension.

Note

This function assumes the sequence dimension is the second dimension.

Parameters:

    tensor (Tensor): The input tensor.
    keepdim (bool, optional): Whether the output tensor has dim retained or not. Default: False

Returns:

    torch.return_types.max: A namedtuple of (values, indices), where values contains the maximum values and indices contains their indices along the sequence dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import max_along_seq
>>> tensor = torch.tensor([[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]])
>>> out = max_along_seq(tensor)
>>> out
torch.return_types.max(
values=tensor([4, 9]),
indices=tensor([4, 4]))
>>> out = max_along_seq(tensor, keepdim=True)
>>> out
torch.return_types.max(
values=tensor([[4], [9]]),
indices=tensor([[4], [4]]))

batchtensor.tensor.mean_along_batch

mean_along_batch(
    tensor: Tensor, keepdim: bool = False
) -> Tensor

Return the mean of all elements along the batch dimension.

Note

This function assumes the batch dimension is the first dimension.

Parameters:

    tensor (Tensor): The input tensor.
    keepdim (bool, optional): Whether the output tensor has dim retained or not. Default: False

Returns:

    Tensor: The mean of all elements along the batch dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import mean_along_batch
>>> tensor = torch.tensor([[0.0, 1.0], [2.0, 3.0], [4.0, 5.0], [6.0, 7.0], [8.0, 9.0]])
>>> out = mean_along_batch(tensor)
>>> out
tensor([4., 5.])
>>> out = mean_along_batch(tensor, keepdim=True)
>>> out
tensor([[4., 5.]])

batchtensor.tensor.mean_along_seq

mean_along_seq(
    tensor: Tensor, keepdim: bool = False
) -> Tensor

Return the mean of all elements along the sequence dimension.

Note

This function assumes the sequence dimension is the second dimension.

Parameters:

    tensor (Tensor): The input tensor.
    keepdim (bool, optional): Whether the output tensor has dim retained or not. Default: False

Returns:

    Tensor: The mean of all elements along the sequence dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import mean_along_seq
>>> tensor = torch.tensor([[0.0, 1.0, 2.0, 3.0, 4.0], [5.0, 6.0, 7.0, 8.0, 9.0]])
>>> out = mean_along_seq(tensor)
>>> out
tensor([2., 7.])
>>> out = mean_along_seq(tensor, keepdim=True)
>>> out
tensor([[2.], [7.]])

batchtensor.tensor.median_along_batch

median_along_batch(
    tensor: Tensor, keepdim: bool = False
) -> torch.return_types.median

Return the median of all elements along the batch dimension.

Note

This function assumes the batch dimension is the first dimension.

Parameters:

    tensor (Tensor): The input tensor.
    keepdim (bool, optional): Whether the output tensor has dim retained or not. Default: False

Returns:

    torch.return_types.median: A namedtuple of (values, indices), where values contains the median values and indices contains their indices along the batch dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import median_along_batch
>>> tensor = torch.tensor([[0, 1], [2, 3], [4, 5], [6, 7], [8, 9]])
>>> out = median_along_batch(tensor)
>>> out
torch.return_types.median(
values=tensor([4, 5]),
indices=tensor([2, 2]))
>>> out = median_along_batch(tensor, keepdim=True)
>>> out
torch.return_types.median(
values=tensor([[4, 5]]),
indices=tensor([[2, 2]]))

batchtensor.tensor.median_along_seq

median_along_seq(
    tensor: Tensor, keepdim: bool = False
) -> torch.return_types.median

Return the median of all elements along the sequence dimension.

Note

This function assumes the sequence dimension is the second dimension.

Parameters:

    tensor (Tensor): The input tensor.
    keepdim (bool, optional): Whether the output tensor has dim retained or not. Default: False

Returns:

    torch.return_types.median: A namedtuple of (values, indices), where values contains the median values and indices contains their indices along the sequence dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import median_along_seq
>>> tensor = torch.tensor([[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]])
>>> out = median_along_seq(tensor)
>>> out
torch.return_types.median(
values=tensor([2, 7]),
indices=tensor([2, 2]))
>>> out = median_along_seq(tensor, keepdim=True)
>>> out
torch.return_types.median(
values=tensor([[2], [7]]),
indices=tensor([[2], [2]]))

batchtensor.tensor.min_along_batch

min_along_batch(
    tensor: Tensor, keepdim: bool = False
) -> torch.return_types.min

Return the minimum of all elements along the batch dimension.

Note

This function assumes the batch dimension is the first dimension.

Parameters:

    tensor (Tensor): The input tensor.
    keepdim (bool, optional): Whether the output tensor has dim retained or not. Default: False

Returns:

    torch.return_types.min: A namedtuple of (values, indices), where values contains the minimum values and indices contains their indices along the batch dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import min_along_batch
>>> tensor = torch.tensor([[0, 1], [2, 3], [4, 5], [6, 7], [8, 9]])
>>> out = min_along_batch(tensor)
>>> out
torch.return_types.min(
values=tensor([0, 1]),
indices=tensor([0, 0]))
>>> out = min_along_batch(tensor, keepdim=True)
>>> out
torch.return_types.min(
values=tensor([[0, 1]]),
indices=tensor([[0, 0]]))

batchtensor.tensor.min_along_seq

min_along_seq(tensor: Tensor, keepdim: bool = False) -> torch.return_types.min

Return the minimum of all elements along the sequence dimension.

Note

This function assumes the sequence dimension is the second dimension.

Parameters:

    tensor (Tensor): The input tensor.
    keepdim (bool, optional): Whether the output tensor has dim retained or not. Default: False

Returns:

    torch.return_types.min: A namedtuple of (values, indices), where values contains the minimum values and indices contains their indices along the sequence dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import min_along_seq
>>> tensor = torch.tensor([[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]])
>>> out = min_along_seq(tensor)
>>> out
torch.return_types.min(
values=tensor([0, 5]),
indices=tensor([0, 0]))
>>> out = min_along_seq(tensor, keepdim=True)
>>> out
torch.return_types.min(
values=tensor([[0], [5]]),
indices=tensor([[0], [0]]))

batchtensor.tensor.permute_along_batch

permute_along_batch(
    tensor: Tensor, permutation: Tensor
) -> Tensor

Permute the tensor along the batch dimension.

Note

This function assumes the batch dimension is the first dimension.

Parameters:

    tensor (Tensor): The tensor to permute.
    permutation (Tensor): The 1-D tensor containing the indices of the permutation. The shape should match the batch dimension of the tensor.

Returns:

    Tensor: The tensor with permuted data along the batch dimension.

Raises:

    RuntimeError: If the shape of the permutation does not match the batch dimension of the tensor.

Example usage:

>>> import torch
>>> from batchtensor.tensor import permute_along_batch
>>> tensor = torch.tensor([[0, 1], [2, 3], [4, 5], [6, 7], [8, 9]])
>>> out = permute_along_batch(tensor, torch.tensor([2, 1, 3, 0, 4]))
>>> out
tensor([[4, 5],
        [2, 3],
        [6, 7],
        [0, 1],
        [8, 9]])
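
A random permutation built with torch.randperm gives a manual shuffle (this is essentially what shuffle_along_batch, below, does for you); the output is elided because the permutation is random:

>>> out = permute_along_batch(tensor, torch.randperm(tensor.shape[0]))
>>> out
tensor([[...]])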

batchtensor.tensor.permute_along_seq

permute_along_seq(
    tensor: Tensor, permutation: Tensor
) -> Tensor

Permute the tensor along the sequence dimension.

Note

This function assumes the sequence dimension is the second dimension.

Parameters:

    tensor (Tensor): The tensor to permute.
    permutation (Tensor): The 1-D tensor containing the indices of the permutation. The shape should match the sequence dimension of the tensor.

Returns:

    Tensor: The tensor with permuted data along the sequence dimension.

Raises:

    RuntimeError: If the shape of the permutation does not match the sequence dimension of the tensor.

Example usage:

>>> import torch
>>> from batchtensor.tensor import permute_along_seq
>>> tensor = torch.tensor([[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]])
>>> out = permute_along_seq(tensor, torch.tensor([2, 1, 3, 0, 4]))
>>> out
tensor([[2, 1, 3, 0, 4],
        [7, 6, 8, 5, 9]])

batchtensor.tensor.prod_along_batch

prod_along_batch(
    tensor: Tensor, keepdim: bool = False
) -> Tensor

Return the product of all elements along the batch dimension.

Note

This function assumes the batch dimension is the first dimension.

Parameters:

    tensor (Tensor): The input tensor.
    keepdim (bool, optional): Whether the output tensor has dim retained or not. Default: False

Returns:

    Tensor: The product of all elements along the batch dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import prod_along_batch
>>> tensor = torch.tensor([[0, 1], [2, 3], [4, 5], [6, 7], [8, 9]])
>>> out = prod_along_batch(tensor)
>>> out
tensor([  0, 945])
>>> out = prod_along_batch(tensor, keepdim=True)
>>> out
tensor([[  0, 945]])

batchtensor.tensor.prod_along_seq

prod_along_seq(
    tensor: Tensor, keepdim: bool = False
) -> Tensor

Return the product of all elements along the sequence dimension.

Note

This function assumes the sequence dimension is the second dimension.

Parameters:

    tensor (Tensor): The input tensor.
    keepdim (bool, optional): Whether the output tensor has dim retained or not. Default: False

Returns:

    Tensor: The product of all elements along the sequence dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import prod_along_seq
>>> tensor = torch.tensor([[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]])
>>> out = prod_along_seq(tensor)
>>> out
tensor([    0, 15120])
>>> out = prod_along_seq(tensor, keepdim=True)
>>> out
tensor([[    0], [15120]])

batchtensor.tensor.repeat_along_seq

repeat_along_seq(tensor: Tensor, repeats: int) -> Tensor

Repeat the data along the sequence dimension.

Note

This function assumes the sequence dimension is the second dimension.

Parameters:

    tensor (Tensor): The input tensor.
    repeats (int): The number of times to repeat the data along the sequence dimension.

Returns:

    Tensor: A new tensor with the data repeated along the sequence dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import repeat_along_seq
>>> tensor = torch.tensor([[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]])
>>> out = repeat_along_seq(tensor, 2)
>>> out
tensor([[0, 1, 2, 3, 4, 0, 1, 2, 3, 4],
        [5, 6, 7, 8, 9, 5, 6, 7, 8, 9]])

batchtensor.tensor.select_along_batch

select_along_batch(tensor: Tensor, index: int) -> Tensor

Slice the input tensor along the batch dimension at the given index.

This function returns a view of the original tensor with the batch dimension removed.

Note

This function assumes the batch dimension is the first dimension.

Parameters:

    tensor (Tensor): The input tensor.
    index (int): The index to select with.

Returns:

    Tensor: The sliced tensor along the batch dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import select_along_batch
>>> tensor = torch.tensor([[0, 1], [2, 3], [4, 5], [6, 7], [8, 9]])
>>> out = select_along_batch(tensor, index=2)
>>> out
tensor([4, 5])

batchtensor.tensor.select_along_seq

select_along_seq(tensor: Tensor, index: int) -> Tensor

Slice the input tensor along the sequence dimension at the given index.

This function returns a view of the original tensor with the sequence dimension removed.

Note

This function assumes the sequence dimension is the second dimension.

Parameters:

    tensor (Tensor): The input tensor.
    index (int): The index to select with.

Returns:

    Tensor: The sliced tensor along the sequence dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import select_along_seq
>>> tensor = torch.tensor([[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]])
>>> out = select_along_seq(tensor, index=2)
>>> out
tensor([2, 7])

batchtensor.tensor.shuffle_along_batch

shuffle_along_batch(
    tensor: Tensor, generator: Generator | None = None
) -> Tensor

Shuffle the tensor along the batch dimension.

Note

This function assumes the batch dimension is the first dimension.

Parameters:

    tensor (Tensor): The tensor to shuffle.
    generator (Generator | None, optional): An optional random number generator. Default: None

Returns:

    Tensor: The shuffled tensor.

Example usage:

>>> import torch
>>> from batchtensor.tensor import shuffle_along_batch
>>> tensor = torch.tensor([[0, 1], [2, 3], [4, 5], [6, 7], [8, 9]])
>>> out = shuffle_along_batch(tensor)
>>> out
tensor([[...]])
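
For reproducible results, an explicitly seeded torch.Generator can be passed through the generator parameter; the exact permutation still depends on the PyTorch RNG, so the output is elided here:

>>> generator = torch.Generator().manual_seed(42)
>>> out = shuffle_along_batch(tensor, generator=generator)
>>> out
tensor([[...]])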

batchtensor.tensor.shuffle_along_seq

shuffle_along_seq(
    tensor: Tensor, generator: Generator | None = None
) -> Tensor

Shuffle the tensor along the sequence dimension.

Note

This function assumes the sequence dimension is the second dimension.

Parameters:

    tensor (Tensor): The tensor to shuffle.
    generator (Generator | None, optional): An optional random number generator. Default: None

Returns:

    Tensor: The shuffled tensor.

Example usage:

>>> import torch
>>> from batchtensor.tensor import shuffle_along_seq
>>> tensor = torch.tensor([[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]])
>>> out = shuffle_along_seq(tensor)
>>> out
tensor([[...]])

batchtensor.tensor.slice_along_batch

slice_along_batch(
    tensor: Tensor,
    start: int = 0,
    stop: int | None = None,
    step: int = 1,
) -> Tensor

Slice the tensor along the batch dimension.

Note

This function assumes the batch dimension is the first dimension.

Parameters:

    tensor (Tensor): The input tensor.
    start (int, optional): The index where the slicing starts. Default: 0
    stop (int | None, optional): The index where the slicing stops. None means the slicing goes to the end. Default: None
    step (int, optional): The increment between each index for slicing. Default: 1

Returns:

    Tensor: The sliced tensor along the batch dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import slice_along_batch
>>> tensor = torch.tensor([[0, 1], [2, 3], [4, 5], [6, 7], [8, 9]])
>>> out = slice_along_batch(tensor, start=2)
>>> out
tensor([[4, 5],
        [6, 7],
        [8, 9]])
>>> out = slice_along_batch(tensor, stop=3)
>>> out
tensor([[0, 1],
        [2, 3],
        [4, 5]])
>>> out = slice_along_batch(tensor, step=2)
>>> out
tensor([[0, 1],
        [4, 5],
        [8, 9]])

batchtensor.tensor.slice_along_seq

slice_along_seq(
    tensor: Tensor,
    start: int = 0,
    stop: int | None = None,
    step: int = 1,
) -> Tensor

Slice the tensor along the sequence dimension.

Note

This function assumes the sequence dimension is the second dimension.

Parameters:

    tensor (Tensor): The input tensor.
    start (int, optional): The index where the slicing starts. Default: 0
    stop (int | None, optional): The index where the slicing stops. None means the slicing goes to the end. Default: None
    step (int, optional): The increment between each index for slicing. Default: 1

Returns:

    Tensor: The sliced tensor along the sequence dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import slice_along_seq
>>> tensor = torch.tensor([[0, 1, 2, 3, 4], [9, 8, 7, 6, 5]])
>>> out = slice_along_seq(tensor, start=2)
>>> out
tensor([[2, 3, 4],
        [7, 6, 5]])
>>> out = slice_along_seq(tensor, stop=3)
>>> out
tensor([[0, 1, 2],
        [9, 8, 7]])
>>> out = slice_along_seq(tensor, step=2)
>>> out
tensor([[0, 2, 4],
        [9, 7, 5]])

batchtensor.tensor.sort_along_batch

sort_along_batch(
    tensor: Tensor, descending: bool = False, **kwargs: Any
) -> torch.return_types.sort

Sort the elements of the input tensor along the batch dimension in ascending order by value.

Note

This function assumes the batch dimension is the first dimension.

Parameters:

    tensor (Tensor): The input tensor.
    descending (bool, optional): Controls the sorting order (ascending or descending). Default: False
    **kwargs (Any): Additional keyword arguments for torch.sort.

Returns:

    torch.return_types.sort: A namedtuple of (values, indices), where values are the sorted values and indices are the indices of the elements in the original input tensor.

Example usage:

>>> import torch
>>> from batchtensor.tensor import sort_along_batch
>>> tensor = torch.tensor([[2, 6], [0, 3], [4, 9], [8, 1], [5, 7]])
>>> out = sort_along_batch(tensor)
>>> out
torch.return_types.sort(
values=tensor([[0, 1], [2, 3], [4, 6], [5, 7], [8, 9]]),
indices=tensor([[1, 3], [0, 1], [2, 0], [4, 4], [3, 2]]))
>>> out = sort_along_batch(tensor, descending=True)
>>> out
torch.return_types.sort(
values=tensor([[8, 9], [5, 7], [4, 6], [2, 3], [0, 1]]),
indices=tensor([[3, 2], [4, 4], [2, 0], [0, 1], [1, 3]]))
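
Extra keyword arguments are forwarded to torch.sort; for example, a stable sort (which preserves the order of equal elements) can be requested with stable=True, assuming the installed PyTorch version supports that flag:

>>> out = sort_along_batch(tensor, stable=True)
>>> out.values
tensor([[0, 1], [2, 3], [4, 6], [5, 7], [8, 9]])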

batchtensor.tensor.sort_along_seq

sort_along_seq(
    tensor: Tensor, descending: bool = False, **kwargs: Any
) -> torch.return_types.sort

Sort the elements of the input tensor along the sequence dimension in ascending order by value.

Note

This function assumes the sequence dimension is the second dimension.

Parameters:

    tensor (Tensor): The input tensor.
    descending (bool, optional): Controls the sorting order (ascending or descending). Default: False
    **kwargs (Any): Additional keyword arguments for torch.sort.

Returns:

    torch.return_types.sort: A namedtuple of (values, indices), where values are the sorted values and indices are the indices of the elements in the original input tensor.

Example usage:

>>> import torch
>>> from batchtensor.tensor import sort_along_seq
>>> tensor = torch.tensor([[7, 3, 0, 8, 5], [1, 9, 6, 4, 2]])
>>> out = sort_along_seq(tensor)
>>> out
torch.return_types.sort(
values=tensor([[0, 3, 5, 7, 8], [1, 2, 4, 6, 9]]),
indices=tensor([[2, 1, 4, 0, 3], [0, 4, 3, 2, 1]]))
>>> out = sort_along_seq(tensor, descending=True)
>>> out
torch.return_types.sort(
values=tensor([[8, 7, 5, 3, 0], [9, 6, 4, 2, 1]]),
indices=tensor([[3, 0, 4, 1, 2], [1, 2, 3, 4, 0]]))

batchtensor.tensor.split_along_batch

split_along_batch(
    tensor: Tensor,
    split_size_or_sections: int | Sequence[int],
) -> tuple[Tensor, ...]

Split the tensor into chunks along the batch dimension.

Each chunk is a view of the original tensor.

Note

This function assumes the batch dimension is the first dimension.

Parameters:

    tensor (Tensor): The input tensor.
    split_size_or_sections (int | Sequence[int]): The size of a single chunk, or a list of sizes for each chunk.

Returns:

    tuple[Tensor, ...]: The tensor chunks.

Example usage:

>>> import torch
>>> from batchtensor.tensor import split_along_batch
>>> tensor = torch.tensor([[0, 1], [2, 3], [4, 5], [6, 7], [8, 9]])
>>> outputs = split_along_batch(tensor, split_size_or_sections=2)
>>> outputs
(tensor([[0, 1], [2, 3]]),
 tensor([[4, 5], [6, 7]]),
 tensor([[8, 9]]))
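
A sequence of sizes splits the batch into chunks of exactly those sizes (they should sum to the batch size), mirroring torch.split:

>>> outputs = split_along_batch(tensor, split_size_or_sections=[2, 3])
>>> outputs
(tensor([[0, 1], [2, 3]]),
 tensor([[4, 5], [6, 7], [8, 9]]))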

batchtensor.tensor.split_along_seq

split_along_seq(
    tensor: Tensor,
    split_size_or_sections: int | Sequence[int],
) -> tuple[Tensor, ...]

Split the tensor into chunks along the sequence dimension.

Each chunk is a view of the original tensor.

Note

This function assumes the sequence dimension is the second dimension.

Parameters:

    tensor (Tensor): The input tensor.
    split_size_or_sections (int | Sequence[int]): The size of a single chunk, or a list of sizes for each chunk.

Returns:

    tuple[Tensor, ...]: The tensor chunks.

Example usage:

>>> import torch
>>> from batchtensor.tensor import split_along_seq
>>> tensor = torch.tensor([[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]])
>>> outputs = split_along_seq(tensor, split_size_or_sections=2)
>>> outputs
(tensor([[0, 1], [5, 6]]),
 tensor([[2, 3], [7, 8]]),
 tensor([[4], [9]]))

batchtensor.tensor.sum_along_batch

sum_along_batch(
    tensor: Tensor, keepdim: bool = False
) -> Tensor

Return the sum of all elements along the batch dimension.

Note

This function assumes the batch dimension is the first dimension.

Parameters:

    tensor (Tensor): The input tensor.
    keepdim (bool, optional): Whether the output tensor has dim retained or not. Default: False

Returns:

    Tensor: The sum of all elements along the batch dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import sum_along_batch
>>> tensor = torch.tensor([[0, 1], [2, 3], [4, 5], [6, 7], [8, 9]])
>>> out = sum_along_batch(tensor)
>>> out
tensor([20, 25])
>>> out = sum_along_batch(tensor, keepdim=True)
>>> out
tensor([[20, 25]])

batchtensor.tensor.sum_along_seq

sum_along_seq(
    tensor: Tensor, keepdim: bool = False
) -> Tensor

Return the sum of all elements along the sequence dimension.

Note

This function assumes the sequence dimension is the second dimension.

Parameters:

    tensor (Tensor): The input tensor.
    keepdim (bool, optional): Whether the output tensor has dim retained or not. Default: False

Returns:

    Tensor: The sum of all elements along the sequence dimension.

Example usage:

>>> import torch
>>> from batchtensor.tensor import sum_along_seq
>>> tensor = torch.tensor([[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]])
>>> out = sum_along_seq(tensor)
>>> out
tensor([10, 35])
>>> out = sum_along_seq(tensor, keepdim=True)
>>> out
tensor([[10], [35]])