API

Create A New Model Based On An Existing Model

Create a new model in Alviss AI based on an existing one using the API.

This tutorial covers creating a new model in Alviss AI based on an existing one (a "remodel" process): the new model is trained from scratch, possibly with different modeling assumptions, rather than updated via refit. It demonstrates reusing both the high-level and the low-level graph of the existing model, which is useful for experimenting with new configurations while leveraging an existing model structure. We walk through the process step by step in Python, using the requests library to call the Alviss AI API; each step includes the relevant code snippet.

Note: Unlike refitting, this script does not poll for completion. It assumes the operations are synchronous, or that you will add status checks as needed for your workflow.
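If you do want to wait for a submitted model to finish, a generic polling helper can be sketched as below. The fetch_status callable, the status strings, and the idea of a status field in the model response are illustrative assumptions — Alviss AI's actual status endpoint and response shape may differ.

```python
import time

def wait_until_done(fetch_status, done_states=("Completed", "Failed"),
                    interval=10.0, timeout=3600.0, sleep=time.sleep):
    """Poll fetch_status() until it returns a terminal state or we time out.

    fetch_status: a zero-argument callable returning the current status string,
    e.g. lambda: requests.get(...).json()["Status"]  (hypothetical field).
    """
    deadline = time.monotonic() + timeout
    while True:
        status = fetch_status()
        if status in done_states:
            return status
        if time.monotonic() >= deadline:
            raise TimeoutError(f"model still in state {status!r} after {timeout}s")
        sleep(interval)
```

The sleep parameter is injectable only to make the helper easy to test; in normal use the default time.sleep applies.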

Prerequisites

  • You need a valid access token from the Alviss AI platform (see the Authentication section in the main API docs).
  • Know your team ID, project ID, and the target model ID to base the new model on.
  • Ensure the existing model has associated dataset and graph information accessible.
  • Install the required Python library if not already present: pip install requests (it is a third-party package, not part of the standard library).

Step 1: Import Necessary Libraries

Import requests for making HTTP API calls.

import requests

Step 2: Set Up Variables

Define the base API URL, your access token, team ID, project ID, and the target model ID. Replace placeholders like <SET ME> with actual values.

url = "https://app.alviss.io/api/v1/api"
token = "<SET ME>"
team_id = "<SET ME>"
project_id = "<SET ME>"
target_model_id = 11

Step 3: Prepare Authentication Headers

Create a headers dictionary with the Authorization Bearer token to authenticate all API requests.

headers = {"Authorization": "Bearer " + token}

Step 4: Construct the Project URL

Build the base URL for the team and project-specific endpoints by formatting the team ID and project ID into the URL string.

team_project_url = url + f"/projects/{team_id}/{project_id}"
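With example IDs substituted in, the composed URL can be checked directly (the IDs below are placeholders, not real values):

```python
url = "https://app.alviss.io/api/v1/api"
team_id = "team-123"      # placeholder
project_id = "proj-456"   # placeholder

# All team/project-scoped endpoints hang off this base
team_project_url = url + f"/projects/{team_id}/{project_id}"
print(team_project_url)  # https://app.alviss.io/api/v1/api/projects/team-123/proj-456
```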

Step 5: Retrieve Existing Model Information

Send a GET request to the /models/{target_model_id} endpoint to fetch details about the existing model. Extract the high-level graph ID and dataset ID from the response.

response = requests.get(
    team_project_url + f"/models/{target_model_id}",
    headers=headers,
)
model_info = response.json()
high_level_graph_id = model_info["GraphInstance"]["Graph"]["Id"]
dataset_id = model_info["DataSet"]["IId"]
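The snippets in this tutorial assume every request succeeds. In practice it is worth failing fast on HTTP errors before parsing the body; a small helper using requests' built-in raise_for_status can be sketched as:

```python
def get_json(response):
    """Raise on HTTP error status codes (4xx/5xx), then decode the JSON body."""
    response.raise_for_status()
    return response.json()
```

model_info = get_json(response) can then replace the bare response.json() calls throughout.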

Step 6: Retrieve High-Level Graph

Send a GET request to the /build/graphs/{high_level_graph_id} endpoint to fetch the high-level graph structure (nodes and edges).

response = requests.get(
    team_project_url + f"/build/graphs/{high_level_graph_id}",
    headers=headers,
)
high_level_graph = response.json()

Step 7: Retrieve Dataset Dates

Send a POST request to the /datasets/{dataset_id}/dates endpoint with the dataset name as a parameter and model-specific filters (country, region, grouping from the existing model info) in the JSON payload. This returns a list of data_dates.

response = requests.post(
    team_project_url + f"/datasets/{dataset_id}/dates",
    headers=headers,
    params={"dataset_name": "Sales"},
    json=[
        {
            "country_code": model_info["Country"],
            "region_code": model_info["Region"],
            "grouping": model_info["Grouping"],
        }
    ],
)
data_dates = response.json()

Step 8: Split Dates for Training and Evaluation

Compute a cut-off index at 75% of the number of data dates, then slice the list into train_dates (the first 75%) and eval_dates (the remaining 25%) for the new model.

separator_index = int(len(data_dates) * 0.75)  # 75/25 train/eval split
train_dates = data_dates[:separator_index]
eval_dates = data_dates[separator_index:]
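The split is a plain list slice, so its behavior is easy to sanity-check with dummy dates (the date strings below are illustrative only):

```python
data_dates = [f"2024-01-{day:02d}" for day in range(1, 9)]  # 8 dummy dates

separator_index = int(len(data_dates) * 0.75)  # int() truncates: 8 * 0.75 -> 6
train_dates = data_dates[:separator_index]
eval_dates = data_dates[separator_index:]

print(len(train_dates), len(eval_dates))  # 6 2
```

Note that int() truncates toward zero, so for date counts that are not multiples of four the training split gets slightly more than 75%.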

Step 9: Submit New Model Using High-Level Graph

Send a POST request to the /build/model endpoint with model parameters, details (including dataset, dates, activation flag, and modeling combination), and the high-level graph in the JSON payload. This creates and submits the new model.

response = requests.post(
    team_project_url + "/build/model",
    headers=headers,
    json={
        "model_param": {
            "epochs": 50,
            "learning_rate": 0.01,
            "samples": 2,
        },
        "model_detail": {
            "dataset_id": dataset_id,
            "train_dates": train_dates,
            "eval_dates": eval_dates,
            "activate": False,  # set True to auto-activate the model
            "modelling_combination": {
                "country_code": model_info["Country"],
                "region_code": model_info["Region"],
                "grouping": model_info["Grouping"],
            },
        },
        "graph_detail": {
            "graph": {
                "nodes": high_level_graph["nodes"],
                "edges": high_level_graph["edges"],
            }
        },
    },
)

Step 10: Retrieve Low-Level Graph ID

Extract the low-level graph ID from the existing model info.

low_level_graph_id = model_info["GraphInstance"]["Id"]

Step 11: Retrieve Low-Level Graph

Send a GET request to the /graph_instances/{team_id}/{low_level_graph_id} endpoint to fetch the low-level graph structure (nodes and edges). Note: This uses the base url, not the team_project_url.

response = requests.get(
    url + f"/graph_instances/{team_id}/{low_level_graph_id}",
    headers=headers,
)
low_level_graph = response.json()

Step 12: Submit New Model Using Low-Level Graph

Send a POST request to the /build/model endpoint with similar model parameters and details, but using the low-level graph in the JSON payload. This creates another new model variant.

response = requests.post(
    team_project_url + "/build/model",
    headers=headers,
    json={
        "model_param": {
            "epochs": 50,
            "learning_rate": 0.01,
            "samples": 2,
        },
        "model_detail": {
            "dataset_id": dataset_id,
            "train_dates": train_dates,
            "eval_dates": eval_dates,
            "activate": False,  # set True to auto-activate the model
            "modelling_combination": {
                "country_code": model_info["Country"],
                "region_code": model_info["Region"],
                "grouping": model_info["Grouping"],
            },
        },
        "graph_detail": {
            "graph": {
                "nodes": low_level_graph["nodes"],
                "edges": low_level_graph["edges"],
            }
        },
    },
)

Full Example Code

import requests

url = "https://app.alviss.io/api/v1/api"
token = "<SET ME>"
team_id = "<SET ME>"
project_id = "<SET ME>"

headers = {"Authorization": "Bearer " + token}
team_project_url = url + f"/projects/{team_id}/{project_id}"

target_model_id = 11

response = requests.get(
    team_project_url + f"/models/{target_model_id}",
    headers=headers,
)
model_info = response.json()
high_level_graph_id = model_info["GraphInstance"]["Graph"]["Id"]
dataset_id = model_info["DataSet"]["IId"]

response = requests.get(
    team_project_url + f"/build/graphs/{high_level_graph_id}",
    headers=headers,
)
high_level_graph = response.json()

response = requests.post(
    team_project_url + f"/datasets/{dataset_id}/dates",
    headers=headers,
    params={"dataset_name": "Sales"},
    json=[
        {
            "country_code": model_info["Country"],
            "region_code": model_info["Region"],
            "grouping": model_info["Grouping"],
        }
    ],
)

data_dates = response.json()
separator_index = int(len(data_dates) * 0.75)  # 75/25 train/eval split
train_dates = data_dates[:separator_index]
eval_dates = data_dates[separator_index:]

# submit the model using high level graph
response = requests.post(
    team_project_url + "/build/model",
    headers=headers,
    json={
        "model_param": {
            "epochs": 50,
            "learning_rate": 0.01,
            "samples": 2,
        },
        "model_detail": {
            "dataset_id": dataset_id,
            "train_dates": train_dates,
            "eval_dates": eval_dates,
            "activate": False,  # set True to auto-activate the model
            "modelling_combination": {
                "country_code": model_info["Country"],
                "region_code": model_info["Region"],
                "grouping": model_info["Grouping"],
            },
        },
        "graph_detail": {
            "graph": {
                "nodes": high_level_graph["nodes"],
                "edges": high_level_graph["edges"],
            }
        },
    },
)

# build model with low level graph
low_level_graph_id = model_info["GraphInstance"]["Id"]
response = requests.get(
    url + f"/graph_instances/{team_id}/{low_level_graph_id}",
    headers=headers,
)
low_level_graph = response.json()
response = requests.post(
    team_project_url + "/build/model",
    headers=headers,
    json={
        "model_param": {
            "epochs": 50,
            "learning_rate": 0.01,
            "samples": 2,
        },
        "model_detail": {
            "dataset_id": dataset_id,
            "train_dates": train_dates,
            "eval_dates": eval_dates,
            "activate": False,  # set True to auto-activate the model
            "modelling_combination": {
                "country_code": model_info["Country"],
                "region_code": model_info["Region"],
                "grouping": model_info["Grouping"],
            },
        },
        "graph_detail": {
            "graph": {
                "nodes": low_level_graph["nodes"],
                "edges": low_level_graph["edges"],
            }
        },
    },
)