Class: OCI::GenerativeAi::Models::LoraTrainingConfig
- Inherits: TrainingConfig
  - Object
  - TrainingConfig
  - OCI::GenerativeAi::Models::LoraTrainingConfig
- Defined in:
- lib/oci/generative_ai/models/lora_training_config.rb
Overview
The LoRA training method hyperparameters.
Constant Summary
Constants inherited from TrainingConfig
TrainingConfig::TRAINING_CONFIG_TYPE_ENUM
Instance Attribute Summary
-
#lora_alpha ⇒ Integer
This parameter represents the scaling factor for the weight matrices in LoRA.
-
#lora_dropout ⇒ Float
This parameter indicates the dropout probability for LoRA layers.
-
#lora_r ⇒ Integer
This parameter represents the LoRA rank of the update matrices.
Attributes inherited from TrainingConfig
#early_stopping_patience, #early_stopping_threshold, #learning_rate, #log_model_metrics_interval_in_steps, #total_training_epochs, #training_batch_size, #training_config_type
Class Method Summary
-
.attribute_map ⇒ Object
Attribute mapping from ruby-style variable name to JSON key.
-
.swagger_types ⇒ Object
Attribute type mapping.
Instance Method Summary
-
#==(other) ⇒ Object
Checks equality by comparing each attribute.
-
#build_from_hash(attributes) ⇒ Object
Builds the object from a hash.
- #eql?(other) ⇒ Boolean
-
#hash ⇒ Fixnum
Calculates hash code according to all attributes.
-
#initialize(attributes = {}) ⇒ LoraTrainingConfig
constructor
Initializes the object.
-
#to_hash ⇒ Hash
Returns the object in the form of a hash.
-
#to_s ⇒ String
Returns the string representation of the object.
Methods inherited from TrainingConfig
Constructor Details
#initialize(attributes = {}) ⇒ LoraTrainingConfig
Initializes the object
# File 'lib/oci/generative_ai/models/lora_training_config.rb', line 76

def initialize(attributes = {})
  return unless attributes.is_a?(Hash)

  attributes['trainingConfigType'] = 'LORA_TRAINING_CONFIG'

  super(attributes)

  # convert string to symbol for hash key
  attributes = attributes.each_with_object({}) { |(k, v), h| h[k.to_sym] = v }

  self.lora_r = attributes[:'loraR'] if attributes[:'loraR']

  raise 'You cannot provide both :loraR and :lora_r' if attributes.key?(:'loraR') && attributes.key?(:'lora_r')

  self.lora_r = attributes[:'lora_r'] if attributes[:'lora_r']

  self.lora_alpha = attributes[:'loraAlpha'] if attributes[:'loraAlpha']

  raise 'You cannot provide both :loraAlpha and :lora_alpha' if attributes.key?(:'loraAlpha') && attributes.key?(:'lora_alpha')

  self.lora_alpha = attributes[:'lora_alpha'] if attributes[:'lora_alpha']

  self.lora_dropout = attributes[:'loraDropout'] if attributes[:'loraDropout']

  raise 'You cannot provide both :loraDropout and :lora_dropout' if attributes.key?(:'loraDropout') && attributes.key?(:'lora_dropout')

  self.lora_dropout = attributes[:'lora_dropout'] if attributes[:'lora_dropout']
end
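As the constructor shows, each attribute may be supplied under either its camelCase JSON key (e.g. `:loraR`) or its snake_case Ruby key (e.g. `:lora_r`), but not both at once. A minimal self-contained sketch of that dual-key pattern (illustrative only, not the SDK code itself; `extract_lora_r` is a hypothetical helper):

```ruby
# Hypothetical helper mirroring the constructor's key handling for one
# attribute: accept :loraR or :lora_r, but reject hashes that set both.
def extract_lora_r(attributes)
  raise 'You cannot provide both :loraR and :lora_r' if attributes.key?(:loraR) && attributes.key?(:lora_r)

  attributes[:lora_r] || attributes[:loraR]
end

puts extract_lora_r(loraR: 8)    # camelCase JSON-style key
puts extract_lora_r(lora_r: 16)  # snake_case Ruby-style key
begin
  extract_lora_r(loraR: 8, lora_r: 16)
rescue RuntimeError => e
  puts e.message
end
```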
Instance Attribute Details
#lora_alpha ⇒ Integer
This parameter represents the scaling factor for the weight matrices in LoRA.
# File 'lib/oci/generative_ai/models/lora_training_config.rb', line 19

def lora_alpha
  @lora_alpha
end
#lora_dropout ⇒ Float
This parameter indicates the dropout probability for LoRA layers.
# File 'lib/oci/generative_ai/models/lora_training_config.rb', line 23

def lora_dropout
  @lora_dropout
end
#lora_r ⇒ Integer
This parameter represents the LoRA rank of the update matrices.
# File 'lib/oci/generative_ai/models/lora_training_config.rb', line 15

def lora_r
  @lora_r
end
Class Method Details
.attribute_map ⇒ Object
Attribute mapping from ruby-style variable name to JSON key.
# File 'lib/oci/generative_ai/models/lora_training_config.rb', line 26

def self.attribute_map
  {
    # rubocop:disable Style/SymbolLiteral
    'training_config_type': :'trainingConfigType',
    'total_training_epochs': :'totalTrainingEpochs',
    'learning_rate': :'learningRate',
    'training_batch_size': :'trainingBatchSize',
    'early_stopping_patience': :'earlyStoppingPatience',
    'early_stopping_threshold': :'earlyStoppingThreshold',
    'log_model_metrics_interval_in_steps': :'logModelMetricsIntervalInSteps',
    'lora_r': :'loraR',
    'lora_alpha': :'loraAlpha',
    'lora_dropout': :'loraDropout'
    # rubocop:enable Style/SymbolLiteral
  }
end
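The attribute map is what lets the model translate between snake_case Ruby attribute names and the camelCase keys used in the JSON wire format. A self-contained sketch of that renaming step (an assumed pattern for illustration, not the SDK's serializer; `ATTRIBUTE_MAP` and `to_json_keys` are hypothetical names):

```ruby
# A reduced attribute map covering only the LoRA-specific keys.
ATTRIBUTE_MAP = {
  lora_r: :loraR,
  lora_alpha: :loraAlpha,
  lora_dropout: :loraDropout
}.freeze

# Rename snake_case Ruby keys to the camelCase JSON keys; keys not in
# the map pass through unchanged.
def to_json_keys(ruby_hash)
  ruby_hash.each_with_object({}) do |(k, v), out|
    out[ATTRIBUTE_MAP.fetch(k, k)] = v
  end
end

p to_json_keys(lora_r: 8, lora_alpha: 16, lora_dropout: 0.1)
```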
.swagger_types ⇒ Object
Attribute type mapping.
# File 'lib/oci/generative_ai/models/lora_training_config.rb', line 44

def self.swagger_types
  {
    # rubocop:disable Style/SymbolLiteral
    'training_config_type': :'String',
    'total_training_epochs': :'Integer',
    'learning_rate': :'Float',
    'training_batch_size': :'Integer',
    'early_stopping_patience': :'Integer',
    'early_stopping_threshold': :'Float',
    'log_model_metrics_interval_in_steps': :'Integer',
    'lora_r': :'Integer',
    'lora_alpha': :'Integer',
    'lora_dropout': :'Float'
    # rubocop:enable Style/SymbolLiteral
  }
end
Instance Method Details
#==(other) ⇒ Object
Checks equality by comparing each attribute.
# File 'lib/oci/generative_ai/models/lora_training_config.rb', line 112

def ==(other)
  return true if equal?(other)

  self.class == other.class &&
    training_config_type == other.training_config_type &&
    total_training_epochs == other.total_training_epochs &&
    learning_rate == other.learning_rate &&
    training_batch_size == other.training_batch_size &&
    early_stopping_patience == other.early_stopping_patience &&
    early_stopping_threshold == other.early_stopping_threshold &&
    log_model_metrics_interval_in_steps == other.log_model_metrics_interval_in_steps &&
    lora_r == other.lora_r &&
    lora_alpha == other.lora_alpha &&
    lora_dropout == other.lora_dropout
end
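Equality here is value-based: two configs are equal when every attribute matches, and `#hash` is derived from the same attribute list so equal objects share a hash code. A self-contained sketch of that contract using a plain class rather than the SDK model (`MiniLoraConfig` is a hypothetical stand-in):

```ruby
# Hypothetical reduced model demonstrating attribute-by-attribute
# equality with a matching #hash, as value-object classes typically do.
class MiniLoraConfig
  attr_accessor :lora_r, :lora_alpha, :lora_dropout

  def initialize(lora_r:, lora_alpha:, lora_dropout:)
    @lora_r = lora_r
    @lora_alpha = lora_alpha
    @lora_dropout = lora_dropout
  end

  def ==(other)
    self.class == other.class &&
      lora_r == other.lora_r &&
      lora_alpha == other.lora_alpha &&
      lora_dropout == other.lora_dropout
  end
  alias eql? ==

  # Derive the hash code from the same attributes, keeping #hash
  # consistent with #== (required for correct Hash/Set behavior).
  def hash
    [lora_r, lora_alpha, lora_dropout].hash
  end
end

a = MiniLoraConfig.new(lora_r: 8, lora_alpha: 16, lora_dropout: 0.1)
b = MiniLoraConfig.new(lora_r: 8, lora_alpha: 16, lora_dropout: 0.1)
p a == b
```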
#build_from_hash(attributes) ⇒ Object
Builds the object from a hash.
# File 'lib/oci/generative_ai/models/lora_training_config.rb', line 151

def build_from_hash(attributes)
  return nil unless attributes.is_a?(Hash)

  self.class.swagger_types.each_pair do |key, type|
    if type =~ /^Array<(.*)>/i
      # check to ensure the input is an array given that the attribute
      # is documented as an array but the input is not
      if attributes[self.class.attribute_map[key]].is_a?(Array)
        public_method("#{key}=").call(
          attributes[self.class.attribute_map[key]]
            .map { |v| OCI::Internal::Util.convert_to_type(Regexp.last_match(1), v) }
        )
      end
    elsif !attributes[self.class.attribute_map[key]].nil?
      public_method("#{key}=").call(
        OCI::Internal::Util.convert_to_type(type, attributes[self.class.attribute_map[key]])
      )
    end
    # or else data not found in attributes(hash), not an issue as the data can be optional
  end

  self
end
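`build_from_hash` walks `swagger_types` and coerces each raw value to its declared type, skipping keys that are absent. A self-contained sketch of that type-map-driven coercion (an assumed simplification; `SWAGGER_TYPES`, `convert_to_type`, and `build` here are hypothetical stand-ins for the SDK's internals):

```ruby
# A reduced type map covering only the LoRA-specific attributes.
SWAGGER_TYPES = { lora_r: 'Integer', lora_alpha: 'Integer', lora_dropout: 'Float' }.freeze

# Coerce a raw value according to its declared type name.
def convert_to_type(type, value)
  case type
  when 'Integer' then Integer(value)
  when 'Float'   then Float(value)
  else value
  end
end

# Walk the type map, converting present values and skipping absent ones,
# mirroring the structure of build_from_hash above.
def build(attributes)
  SWAGGER_TYPES.each_with_object({}) do |(key, type), built|
    raw = attributes[key]
    built[key] = convert_to_type(type, raw) unless raw.nil?
  end
end

p build(lora_r: '8', lora_dropout: '0.1')
```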
#eql?(other) ⇒ Boolean
# File 'lib/oci/generative_ai/models/lora_training_config.rb', line 131

def eql?(other)
  self == other
end
#hash ⇒ Fixnum
Calculates hash code according to all attributes.
# File 'lib/oci/generative_ai/models/lora_training_config.rb', line 140

def hash
  [training_config_type, total_training_epochs, learning_rate, training_batch_size,
   early_stopping_patience, early_stopping_threshold, log_model_metrics_interval_in_steps,
   lora_r, lora_alpha, lora_dropout].hash
end
#to_hash ⇒ Hash
Returns the object in the form of a hash.
# File 'lib/oci/generative_ai/models/lora_training_config.rb', line 184

def to_hash
  hash = {}
  self.class.attribute_map.each_pair do |attr, param|
    value = public_method(attr).call
    next if value.nil? && !instance_variable_defined?("@#{attr}")

    hash[param] = _to_hash(value)
  end
  hash
end
#to_s ⇒ String
Returns the string representation of the object
# File 'lib/oci/generative_ai/models/lora_training_config.rb', line 178

def to_s
  to_hash.to_s
end