USE(7.3)
AbEC - Adding a New Metric

> AbEC/abec/metrics/metric.py

This is the file where we will implement the metric. First of all, let's go over some requirements.

> Hyper-parameters

In order to set the hyper-parameters of the metric, the file must define a list named params, in which each element is one hyper-parameter of our new metric. Below is a generic example of this list.

params = ["HYPER-PARAMETER-1", "HYPER-PARAMETER-2", ..., "HYPER-PARAMETER-N"]






> Vars

These are the variables used by the metric during the optimization process; for example, auxiliary variables that keep some value which needs to be used in a future generation. They are defined in a list named vars, in which each element is one variable of our new metric. Below is a generic example of this list.

vars = ["metric", "auxiliar_variable-1", ..., "auxiliar_variable-n"]






> Log

These are the variables from the vars list defined above that will be recorded in the log file. They are defined in a list named log, in which each element is one variable of our new metric. Below is a generic example of this list.

log = ["metric", "auxiliar_variable-1", ..., "auxiliar_variable-n"]
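For instance, a hypothetical metric named MYMETRIC that averages the error over a sliding window (all names here are illustrative, not part of AbEC) could declare the three lists as:

params = ["WINDOW"]                 # hyper-parameters read from the configuration file
vars = ["metric", "window_values"]  # state kept between calls of the metric
log = ["metric"]                    # only the metric value is recorded in the log file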






> Scope

Once we have defined the hyper-parameters, the auxiliary variables, and the variables that should be recorded in the log file, we need to define in which scope of the optimization process our new metric is going to act.
To deal with this, we have 3 scopes in which we can define our metric, and a list named scope indicates to which of them the metric belongs (see the note on multiple scopes after this list). The scopes are:
  • Individual (IND): Measures that need to be calculated at each evaluation of the optimization process. For example, the Offline Error, a metric commonly used in Dynamic Optimization Problems, which needs to be updated at every single evaluation.

    params = ["HYPER-PARAMETER-1", "HYPER-PARAMETER-2", ..., "HYPER-PARAMETER-N"]
    vars = ["metric", "auxiliar_variable-1", ..., "auxiliar_variable-n"]
    log = ["metric", "auxiliar_variable-1", ..., "auxiliar_variable-n"]
    scope = ["IND"]

    def metric(var_metric, runVars, parameters, ind=0):
        ...  # computation/update of the metric goes here
        return var_metric

    def finishMetric(var_metric, path):
        ...  # final processing of the metric goes here
        return var_metric


    The template of a new (IND) metric can be downloaded from the link: [ Metric template ].




  • Generation (GEN): Measures calculated only at the end of each generation. For example, the Online Error, which is the most common metric used in Evolutionary Computation and is also used to visualize the performance curve of the algorithm.

    params = ["HYPER-PARAMETER-1", "HYPER-PARAMETER-2", ..., "HYPER-PARAMETER-N"]
    vars = ["metric", "auxiliar_variable-1", ..., "auxiliar_variable-n"]
    log = ["metric", "auxiliar_variable-1", ..., "auxiliar_variable-n"]
    scope = ["GEN"]

    def metric(var_metric, runVars, parameters, ind=0):
        ...  # computation/update of the metric goes here
        return var_metric

    def finishMetric(var_metric, path):
        ...  # final processing of the metric goes here
        return var_metric


    The template of a new (GEN) metric can be downloaded from the link: [ Metric template ].




  • Run (RUN): Measures calculated only at the end of each run, usually metrics that result in a single number and do not produce a graphic or any kind of curve. For example, the Best Error, which is the best error found during the whole optimization process.

    params = ["HYPER-PARAMETER-1", "HYPER-PARAMETER-2", ..., "HYPER-PARAMETER-N"]
    vars = ["metric", "auxiliar_variable-1", ..., "auxiliar_variable-n"]
    log = ["metric", "auxiliar_variable-1", ..., "auxiliar_variable-n"]
    scope = ["RUN"]

    def metric(var_metric, runVars, parameters, ind=0):
        ...  # computation/update of the metric goes here
        return var_metric

    def finishMetric(var_metric, path):
        ...  # final processing of the metric goes here
        return var_metric


    The template of a new (RUN) metric can be downloaded from the link: [ Metric template ].
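Note that scope is defined as a list. Assuming the framework accepts more than one entry in it, a metric that is updated at each generation and also summarized at the end of each run would declare:

scope = ["GEN", "RUN"]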







> Confirmation of hyper-parameters (cp)

This method is used to confirm that all the values of the hyper-parameters are within the allowed ranges. A generic example of this method is shown below:

from aux.aux import errorWarning
import sys

def cp(parameters):
    if not (MIN <= parameters["<NAME-OF-THE-METRIC>_<HYPER-PARAMETER-1>"] <= MAX):
       errorWarning("4.X.1", "algoConfig.ini", "<NAME-OF-THE-METRIC>_<HYPER-PARAMETER-1>", "The <HYPER-PARAMETER-1> should be in the interval [MIN, MAX]")
       sys.exit()

    if not (MIN <= parameters["<NAME-OF-THE-METRIC>_<HYPER-PARAMETER-2>"] <= MAX):
       errorWarning("4.X.2", "algoConfig.ini", "<NAME-OF-THE-METRIC>_<HYPER-PARAMETER-2>", "The <HYPER-PARAMETER-2> should be in the interval [MIN, MAX]")
       sys.exit()

    if not (MIN <= parameters["<NAME-OF-THE-METRIC>_<HYPER-PARAMETER-N>"] <= MAX):
       errorWarning("4.X.N", "algoConfig.ini", "<NAME-OF-THE-METRIC>_<HYPER-PARAMETER-N>", "The <HYPER-PARAMETER-N> should be in the interval [MIN, MAX]")
       sys.exit()

    return 1
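As a concrete instance, consider the hypothetical MYMETRIC from the example above, with its single hyper-parameter WINDOW restricted to the interval [1, 100]; its cp() could be:

from aux.aux import errorWarning
import sys

def cp(parameters):
    # MYMETRIC and WINDOW are illustrative names used only in this example
    if not (1 <= parameters["MYMETRIC_WINDOW"] <= 100):
       errorWarning("4.X.1", "algoConfig.ini", "MYMETRIC_WINDOW", "The WINDOW should be in the interval [1, 100]")
       sys.exit()

    return 1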





> metric.py

And then the complete file will be something like:

from aux.aux import errorWarning
import sys

params = ["HYPER-PARAMETER-1", "HYPER-PARAMETER-2", ..., "HYPER-PARAMETER-N"]
vars = ["metric", "auxiliar_variable-1", ..., "auxiliar_variable-n"]
log = ["metric", "auxiliar_variable-1", ..., "auxiliar_variable-n"]
scope = ["XXX"]

def cp(parameters):
    if not (MIN <= parameters["<NAME-OF-THE-METRIC>_<HYPER-PARAMETER-1>"] <= MAX):
       errorWarning("3.X.1", "algoConfig.ini", "<NAME-OF-THE-METRIC>_<HYPER-PARAMETER-1>", "The <HYPER-PARAMETER-1> should be in the interval [MIN, MAX]")
       sys.exit()

    if not (MIN <= parameters["<NAME-OF-THE-METRIC>_<HYPER-PARAMETER-2>"] <= MAX):
       errorWarning("3.X.2", "algoConfig.ini", "<NAME-OF-THE-METRIC>_<HYPER-PARAMETER-2>", "The <HYPER-PARAMETER-2> should be in the interval [MIN, MAX]")
       sys.exit()

    if not (MIN <= parameters["<NAME-OF-THE-METRIC>_<HYPER-PARAMETER-N>"] <= MAX):
       errorWarning("3.X.N", "algoConfig.ini", "<NAME-OF-THE-METRIC>_<HYPER-PARAMETER-N>", "The <HYPER-PARAMETER-N> should be in the interval [MIN, MAX]")
       sys.exit()

    return 1

def metric(var_metric, runVars, parameters, ind=0):
    ...  # computation/update of the metric goes here
    return var_metric

def finishMetric(var_metric, path):
    ...  # final processing of the metric goes here
    return var_metric





> expConfig.ini

Once you have added the metric script in the respective folder, you will be able to configure it in the configuration file "expConfig.ini". The metric will appear in the file as follows:


{
"MTC_<NAME-OF-THE-METRIC>": 1 -> 0 or 1; 0 for turned off and 1 for turned on (bool)
}

The metric parameters:

{
"MTC_<NAME-OF-THE-METRIC>": 1, -> 0 or 1; 0 for turned off and 1 for turned on (bool)
"MTC_<NAME-OF-THE-METRIC>_<HYPER-PARAMETER-1>": value, -> range permitted for the value (type)
"MTC_<NAME-OF-THE-METRIC>_<HYPER-PARAMETER-2>": value, -> range permitted for the value (type)
.
.
"MTC_<NAME-OF-THE-METRIC>_<HYPER-PARAMETER-N>": value -> range permitted for the value (type)
}
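For instance, for the hypothetical MYMETRIC used in the previous examples, the entries would look like:

{
"MTC_MYMETRIC": 1, -> turned on (bool)
"MTC_MYMETRIC_WINDOW": 10 -> window size, value permitted in [1, 100] (int)
}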

[~]$ Example of a new Metric
As an example, we are now going to add a metric: the Offline Error (eo).
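The offline error is the error of the best solution found so far, averaged over every evaluation performed during the run (in Dynamic Optimization Problems, the best-so-far is usually reset whenever the environment changes). Since it must be updated at every single evaluation, it is a metric of the IND scope.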

> AbEC/abec/metrics/eo.py


$ nano abec/metrics/eo.py


> [ file ]

# NOTE: this listing is a sketch of the Offline Error that follows the metric
# template above. The fields accessed on runVars in metric() ("best" and
# "nevals") are assumptions and may need to be adapted to the actual AbEC run
# variables; var_metric is assumed to be a dict keyed by the names in vars and
# initialized to 0, and the best fitness is assumed to already represent an
# error (the distance to the optimum).

params = []                # the offline error needs no hyper-parameters
vars = ["eo", "eo_sum"]    # running average and accumulated error
log = ["eo"]
scope = ["IND"]            # updated at every individual evaluation

def cp(parameters):
    # Nothing to check, as the metric has no hyper-parameters
    return 1

def metric(var_metric, runVars, parameters, ind=0):
    # Accumulate the error of the best solution found so far and keep the
    # running average, which is the offline error up to this evaluation
    var_metric["eo_sum"] += abs(runVars.best["fit"])
    var_metric["eo"] = var_metric["eo_sum"] / runVars.nevals
    return var_metric

def finishMetric(var_metric, path):
    # Nothing left to compute; "eo" already holds the offline error of the run
    return var_metric
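Because "eo" is listed in log, its value is recorded in the log file throughout the run, which also makes it possible to plot how the offline error evolves over the evaluations.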



And then, to use it in the experiment, we enable it in the configuration file expConfig.ini, following the MTC_<NAME-OF-THE-METRIC> convention described above (the other experiment settings are omitted here):

$ nano abec/expConfig.ini


> [ file ]

{
"MTC_EO": 1
}

Feel free to use this framework.
AbEC © 2023.