Python SDK
computalot is the Python SDK for Computalot.
Install
python3 -m pip install --user --break-system-packages \
https://computalot.com/docs/downloads/computalot-0.2.0-py3-none-any.whl
export PATH="$HOME/.local/bin:$PATH"

Setup
export COMPUTALOT_CONTROLLER_URL="https://computalot.com"
export COMPUTALOT_API_TOKEN="YOUR_TOKEN"

Quick start: first authenticated probe
from computalot import ComputalotClient
client = ComputalotClient(
    controller_url="https://computalot.com",
    token="YOUR_TOKEN",
)
docs = client.docs_index()
recipes = client.list_recipes()
jobs = client.list_jobs(limit=5)
print(docs["status"])
print(len(recipes.get("recipes", [])))
print(len(jobs.get("jobs", [])))

Once you have a ready project or recipe payload, use the submit examples below.
Submitting jobs
# Simple
job = client.submit_structured(
    runner_command=["python", "evaluate.py"],
    payload={"dataset": "smoke"},
    project="my-project",
)
# With fan-out
job = client.submit_structured(
    runner_command=["python", "evaluate.py"],
    payload={"models": ["gpt-4", "claude"], "config": {}},
    fan_out={"by": "models"},
    merge_strategy="keyed",
    project="my-project",
)
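The exact fan-out semantics live controller-side, but as a rough mental model (the helper names below are illustrative, not part of the SDK): fan_out={"by": "models"} expands the payload into one task per element of the named list, and merge_strategy="keyed" reassembles the per-task results into a dict keyed by that element. A minimal sketch, under those assumptions:

```python
from copy import deepcopy

def fan_out_payload(payload, by):
    """Expand one payload into per-item payloads along the `by` key."""
    tasks = []
    for item in payload[by]:
        sub = deepcopy(payload)
        sub[by] = item  # each fanned-out task sees a single element
        tasks.append(sub)
    return tasks

def merge_keyed(by, task_payloads, task_results):
    """Reassemble per-task results into a dict keyed by the fanned-out value."""
    return {p[by]: r for p, r in zip(task_payloads, task_results)}

payload = {"models": ["gpt-4", "claude"], "config": {}}
tasks = fan_out_payload(payload, by="models")
results = merge_keyed("models", tasks, [{"score": 0.9}, {"score": 0.8}])
print(results)  # {'gpt-4': {'score': 0.9}, 'claude': {'score': 0.8}}
```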
# With GPU requirements
job = client.submit_job({
    "type": "structured_runner",
    "runner_command": ["python", "train.py"],
    "payload": {"epochs": 100},
    "project": "my-project",
    "requirements": {"profile": "gpu", "gpu_count": 1},
    "checkpointing": {"enabled": True, "resume_from_latest": True},
})
# With dependencies (DAG)
eval_job = client.submit_structured(
    runner_command=["python", "evaluate.py"],
    depends_on=[train_job["id"]],
    project="my-project",
)

Inspecting jobs
job = client.get_job(job_id)
tasks = client.job_tasks(job_id)
events = client.job_events(job_id)
metrics = client.job_metrics(job_id)
latest_progress = tasks["tasks"][0].get("latest_progress")
latest_checkpoint = tasks["tasks"][0].get("checkpoint")
resume_state = tasks["tasks"][0].get("resume_state")

For long-running checkpointed jobs, latest_checkpoint can include durable publication fields such as artifact_id, artifact_source, publish_status, and published_at. On retry, when the latest checkpoint was published as an artifact, _resume.checkpoint.path is rewritten to the local path of the downloaded checkpoint.
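Since any given task may carry only a subset of those checkpoint fields, read them defensively. A sketch of inspecting a checkpoint record, assuming publish_status takes the value "published" for artifact-backed checkpoints (the field names follow the list above; the sample values are made up):

```python
def describe_checkpoint(checkpoint):
    """Summarize a task's latest checkpoint record, tolerating missing fields."""
    if not checkpoint:
        return "no checkpoint yet"
    if checkpoint.get("publish_status") == "published":
        return (f"published as artifact {checkpoint.get('artifact_id')} "
                f"at {checkpoint.get('published_at')}")
    return f"local checkpoint at {checkpoint.get('path', '<unknown>')}"

print(describe_checkpoint(None))
print(describe_checkpoint({"path": "/scratch/ckpt-0005.pt"}))
print(describe_checkpoint({
    "artifact_id": "art_123",
    "artifact_source": "object-store",
    "publish_status": "published",
    "published_at": "2024-01-01T00:00:00Z",
}))
```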
Artifacts
meta = client.upload_artifact("/path/to/file.parquet", filename="file.parquet")
direct_meta = client.upload_artifact_direct(
    "/path/to/large-file.parquet",
    filename="large-file.parquet",
)
multipart_meta = client.upload_artifact_multipart(
    "/path/to/huge-checkpoint.safetensors",
    filename="huge-checkpoint.safetensors",
)
client.download_artifact(meta["id"], "/path/to/output.parquet")
artifacts = client.list_artifacts()

download_artifact() prefers the signed object-store URL from artifact metadata when available, resumes from <dest>.partial with an HTTP Range request when supported, and verifies the completed file against the artifact's size and SHA-256 metadata before the final rename.
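The client handles that verify-then-rename step internally; if you ever fetch artifacts yourself, the same pattern looks roughly like this stdlib-only sketch (the metadata field names and paths are illustrative, not the SDK's):

```python
import hashlib
import os
import tempfile

def finalize_download(partial_path, dest, expected_size, expected_sha256):
    """Verify a completed .partial file against size and SHA-256 metadata,
    then atomically rename it into place."""
    if os.path.getsize(partial_path) != expected_size:
        raise ValueError("size mismatch; keep the .partial for a resumed retry")
    h = hashlib.sha256()
    with open(partial_path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    if h.hexdigest() != expected_sha256:
        raise ValueError("checksum mismatch")
    os.replace(partial_path, dest)  # atomic rename on POSIX

# Usage sketch with a throwaway file:
tmp = tempfile.mkdtemp()
partial = os.path.join(tmp, "output.parquet.partial")
dest = os.path.join(tmp, "output.parquet")
data = b"hello artifact"
with open(partial, "wb") as f:
    f.write(data)
finalize_download(partial, dest,
                  expected_size=len(data),
                  expected_sha256=hashlib.sha256(data).hexdigest())
```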
Sealed Recipe Jobs
# Submit a recipe job (no project needed)
job = client.submit_job({
    "recipe": "packing",
    "payload": {
        "operation": "eval",
        "candidate": [0.12, 0.31, 0.88],
    },
    "timeout_s": 300,
})
final = client.wait_for_job(job["id"])
results = client.get_results(job["id"])

CLI
export COMPUTALOT_CONTROLLER_URL="https://computalot.com"
export COMPUTALOT_API_TOKEN="YOUR_TOKEN"
computalot docs --llm
computalot jobs --limit 5
computalot job <job_id>

Once a project is ready, the submit helpers are:
computalot submit --project my-project python -c "print('done')"
computalot run --project my-project python evaluate.py
computalot run --project my-project --gpu python train.py