MultitargetSRTestRegressor
A model type for constructing a multi-target symbolic regressor via evolutionary search, based on SymbolicRegression.jl, and implementing the MLJ model interface.

From MLJ, the type can be imported using

    MultitargetSRTestRegressor = @load MultitargetSRTestRegressor pkg=SymbolicRegression

Do `model = MultitargetSRTestRegressor()` to construct an instance with default hyper-parameters. Provide keyword arguments to override hyper-parameter defaults, as in `MultitargetSRTestRegressor(defaults=...)`.
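For example, here is a sketch of loading the model and overriding a few hyper-parameters. The particular operator choices and values below are illustrative, not the defaults:

```julia
using MLJ

# Load the model type from the SymbolicRegression package.
MultitargetSRTestRegressor = @load MultitargetSRTestRegressor pkg=SymbolicRegression

# Construct an instance; every keyword argument is optional and
# falls back to the defaults listed below when omitted.
model = MultitargetSRTestRegressor(
    binary_operators=[+, -, *, /],   # illustrative operator set
    unary_operators=[cos, exp],
    maxsize=25,                      # cap on expression size
    niterations=40,                  # illustrative search budget
)
```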
Hyper-parameters
- `defaults = nothing`
- `binary_operators = nothing`
- `unary_operators = nothing`
- `maxsize = nothing`
- `maxdepth = nothing`
- `expression_spec = nothing`
- `populations = nothing`
- `population_size = nothing`
- `ncycles_per_iteration = nothing`
- `elementwise_loss = nothing`
- `loss_function = nothing`
- `loss_function_expression = nothing`
- `dimensional_constraint_penalty = nothing`
- `parsimony = nothing`
- `constraints = nothing`
- `nested_constraints = nothing`
- `complexity_of_operators = nothing`
- `complexity_of_constants = nothing`
- `complexity_of_variables = nothing`
- `warmup_maxsize_by = nothing`
- `adaptive_parsimony_scaling = nothing`
- `mutation_weights = nothing`
- `crossover_probability = nothing`
- `annealing = nothing`
- `alpha = nothing`
- `tournament_selection_n = nothing`
- `tournament_selection_p = nothing`
- `early_stop_condition = nothing`
- `batching = nothing`
- `batch_size = nothing`
- `dimensionless_constants_only = false`
- `complexity_mapping = nothing`
- `use_frequency = true`
- `use_frequency_in_tournament = true`
- `should_simplify = nothing`
- `perturbation_factor = nothing`
- `probability_negate_constant = nothing`
- `skip_mutation_failures = true`
- `optimizer_algorithm = Optim.BFGS(alphaguess=LineSearches.InitialStatic(), linesearch=LineSearches.BackTracking())`
- `optimizer_nrestarts = 2`
- `optimizer_probability = 0.14`
- `optimizer_iterations = nothing`
- `optimizer_f_calls_limit = nothing`
- `optimizer_options = nothing`
- `should_optimize_constants = true`
- `migration = true`
- `hof_migration = true`
- `fraction_replaced = nothing`
- `fraction_replaced_hof = nothing`
- `topn = nothing`
- `timeout_in_seconds = nothing`
- `max_evals = nothing`
- `input_stream = stdin`
- `turbo = false`
- `bumper = false`
- `autodiff_backend = nothing`
- `deterministic = false`
- `seed = nothing`
- `verbosity = nothing`
- `print_precision = 5`
- `progress = nothing`
- `output_directory = nothing`
- `save_to_file = true`
- `bin_constraints = nothing`
- `una_constraints = nothing`
- `terminal_width = nothing`
- `use_recorder = false`
- `recorder_file = "pysr_recorder.json"`
- `define_helper_functions = true`
- `expression_type = nothing`
- `expression_options = nothing`
- `node_type = nothing`
- `output_file = nothing`
- `fast_cycle = false`
- `npopulations = nothing`
- `npop = nothing`
- `niterations = 1`
- `parallelism = :multithreading`
- `numprocs = nothing`
- `procs = nothing`
- `addprocs_function = nothing`
- `heap_size_hint_in_bytes = nothing`
- `worker_imports = nothing`
- `logger = nothing`
- `runtests = true`
- `run_id = nothing`
- `loss_type = Nothing`
- `selection_method = choose_best`
- `dimensions_type = DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}`
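Putting it together, a minimal end-to-end sketch using the standard MLJ machine workflow. The synthetic data and target formulas below are purely illustrative; for a multi-target model, `y` is a table with one column per target:

```julia
using MLJ

MultitargetSRTestRegressor = @load MultitargetSRTestRegressor pkg=SymbolicRegression

# Synthetic feature table (illustrative).
X = (a = rand(100), b = rand(100))

# Two targets, one column each (illustrative formulas).
y = (y1 = 2 .* cos.(X.a .* 23.5) .- X.b .^ 2,
     y2 = X.a .^ 2 .+ 1.5)

# Bind model and data in a machine, then train.
mach = machine(MultitargetSRTestRegressor(niterations=40), X, y)
fit!(mach)

# Predictions from the selected expression for each target,
# and the report of discovered equations.
yhat = predict(mach, X)
report(mach)
```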