pub struct LayerNorm<B>
where
    B: Backend,
{ /* private fields */ }
Implementations§
impl<B> LayerNorm<B>
where
    B: Backend,
pub fn new(
    normalized_size: usize,
    eps: f64,
    dtype: DType,
    device: &<B as Backend>::Device,
) -> Result<LayerNorm<B>, Error>
Create a new LayerNorm layer.
§Arguments
- normalized_size: size of the last dimension to normalize
- eps: numerical stability constant (typically 1e-5)
- dtype: data type for parameters
- device: device to create parameters on
pub fn from_tensors(
    weight: Tensor<B>,
    bias: Tensor<B>,
    eps: f64,
) -> Result<LayerNorm<B>, Error>
Create from existing weight and bias tensors.
pub fn eps(&self) -> f64
Return the numerical stability constant.
pub fn normalized_size(&self) -> usize
Return the size of the normalized (last) dimension.
Trait Implementations§
impl<B> Module<B> for LayerNorm<B>
where
    B: Backend,
fn forward(&self, x: &Tensor<B>) -> Result<Tensor<B>, Error>
Forward pass: normalize over last dimension, then scale + shift.
For input [batch, seq, d_model]:
- mean = mean(x, dim=-1, keepdim=true) → [batch, seq, 1]
- var = var(x, dim=-1, keepdim=true) → [batch, seq, 1]
- x_norm = (x - mean) / sqrt(var + eps) → [batch, seq, d_model]
- output = x_norm * γ + β → [batch, seq, d_model]
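The four steps above can be sketched in plain Rust on a flat, row-major buffer, independent of the crate's `Backend` and `Tensor` types (the stand-in function and its argument names are illustrative, not this crate's API):

```rust
// Layer normalization over the last dimension of a [rows, d] matrix,
// stored row-major in a flat slice. Mirrors the steps listed above.
fn layer_norm(x: &[f64], d: usize, gamma: &[f64], beta: &[f64], eps: f64) -> Vec<f64> {
    let mut out = Vec::with_capacity(x.len());
    for row in x.chunks(d) {
        // mean over the last dimension
        let mean = row.iter().sum::<f64>() / d as f64;
        // (biased) variance over the last dimension; the crate may use a
        // different estimator
        let var = row.iter().map(|v| (v - mean).powi(2)).sum::<f64>() / d as f64;
        let inv_std = 1.0 / (var + eps).sqrt();
        // scale + shift: x_norm * γ + β
        for (i, v) in row.iter().enumerate() {
            out.push((v - mean) * inv_std * gamma[i] + beta[i]);
        }
    }
    out
}

fn main() {
    let x = vec![1.0, 2.0, 3.0, 4.0]; // one row, d = 4
    let y = layer_norm(&x, 4, &[1.0; 4], &[0.0; 4], 1e-5);
    // each normalized row has (approximately) zero mean and unit variance
    let mean: f64 = y.iter().sum::<f64>() / 4.0;
    assert!(mean.abs() < 1e-9);
}
```

With γ = 1 and β = 0 this is pure standardization; the learned γ and β let the layer undo or rescale that standardization per feature.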
fn parameters(&self) -> Vec<Tensor<B>>
Return all trainable parameters of this module.
The optimizer uses these to update weights during training.
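As a minimal sketch of how an optimizer might consume this parameter list, here is a plain-SGD step using `Vec<f64>` as a stand-in for `Tensor<B>` (the function name and learning rate are hypothetical, not part of this crate):

```rust
// One SGD step over a module's parameter list. grads[i] pairs with params[i].
fn sgd_step(params: &mut [Vec<f64>], grads: &[Vec<f64>], lr: f64) {
    for (p, g) in params.iter_mut().zip(grads) {
        for (w, dw) in p.iter_mut().zip(g) {
            *w -= lr * dw; // w ← w − lr · ∂L/∂w
        }
    }
}

fn main() {
    // e.g. LayerNorm's γ and β for normalized_size = 2
    let mut params = vec![vec![1.0, 1.0], vec![0.0, 0.0]];
    let grads = vec![vec![0.5, -0.5], vec![0.1, 0.1]];
    sgd_step(&mut params, &grads, 0.1);
    assert!((params[0][0] - 0.95).abs() < 1e-12);
    assert!((params[0][1] - 1.05).abs() < 1e-12);
}
```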
fn named_parameters(&self) -> Vec<(String, Tensor<B>)>
Return all trainable parameters with human-readable names.
fn set_training(&self, _training: bool)
Set training or evaluation mode.
fn is_training(&self) -> bool
Whether the module is in training mode (default: true).
fn num_parameters(&self) -> usize
Total number of scalar parameters in this module.
fn trainable_params_count(&self) -> usize
Number of trainable (variable) parameters.
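Assuming the layer holds one weight vector γ and one bias vector β, each of length normalized_size (as `from_tensors` suggests), the count reported by `num_parameters` works out to 2 × normalized_size; a quick sanity check:

```rust
fn main() {
    let normalized_size = 512;
    // γ (weight) and β (bias) each contribute `normalized_size` scalars.
    let param_lens = [normalized_size, normalized_size];
    // num_parameters-style count: sum of element counts over all parameters.
    let total: usize = param_lens.iter().sum();
    assert_eq!(total, 2 * normalized_size);
    println!("{total}"); // prints 1024 for normalized_size = 512
}
```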
Auto Trait Implementations§
impl<B> Freeze for LayerNorm<B>
impl<B> RefUnwindSafe for LayerNorm<B>
impl<B> Send for LayerNorm<B>
impl<B> Sync for LayerNorm<B>
impl<B> Unpin for LayerNorm<B>
impl<B> UnwindSafe for LayerNorm<B>
Blanket Implementations§
impl<T> BorrowMut<T> for T
where
    T: ?Sized,
fn borrow_mut(&mut self) -> &mut T
Mutably borrows from an owned value.
impl<T> IntoEither for T
fn into_either(self, into_left: bool) -> Either<Self, Self>
Converts self into a Left variant of Either<Self, Self> if into_left is true; otherwise converts self into a Right variant.
fn into_either_with<F>(self, into_left: F) -> Either<Self, Self>
Converts self into a Left variant of Either<Self, Self> if into_left(&self) returns true; otherwise converts self into a Right variant.