Code Submissions
All algorithms in TIG are open source, and you can develop your own novel algorithm or improve an existing one.
Getting Started
Follow the guide below, which will walk you through the entire process of developing and submitting code to TIG.
Setting up the Developer Environment
Presently, code submissions are restricted to Rust, and are compiled into a shared object for execution by Benchmarkers through tig-runtime. Rust was chosen for its performance advantages over other languages, enhancing commercial viability of algorithms contributed to TIG, particularly in high-performance computing. Future iterations of TIG will support additional compatible languages.
To facilitate this, Innovators must set up a Rust development environment.
We recommend developing with Visual Studio Code and its Rust plugins, which you can install by searching for them in the Visual Studio Code Extensions Marketplace.
Creating my_first_algo
First, you will need to clone TIG Monorepo:
```
git clone https://github.com/tig-foundation/tig-monorepo
```

Next, you will need to decide which Challenge you want to work on. All Challenge descriptions and code can be found in tig-challenges.
Once you have selected a challenge to work on, you will need to do two things:
- Make a copy of `tig-algorithms/src/<CHALLENGE>/template.rs` and name it `my_first_algo.rs`
  - If you are working on a GPU challenge, you will also need to copy and rename `template.cu`. This is for implementing CUDA kernels.
- Edit `tig-algorithms/src/<CHALLENGE>/mod.rs` and add the line: `pub mod my_first_algo;`
You are now ready to compile and test your algorithm!
Compiling & Testing Your Code
TIG provides a dev docker image for each challenge, containing the environment for compiling and testing your code.
You will need to have docker installed on your machine.
```
# cd to tig-monorepo folder
# replace <CHALLENGE>. e.g. satisfiability
docker run -it -v $(pwd):/app ghcr.io/tig-foundation/tig-monorepo/<CHALLENGE>/dev:latest

# inside the docker
build_algorithm my_first_algo

# see notes below for <TRACK> <HYPERPARAMETERS>
test_algorithm my_first_algo <TRACK> <HYPERPARAMETERS>
```

Next, edit your code so that you find solutions!
Notes for test_algorithm:
- Recommended track to check your code works. Replace `<TRACK>` with:

  ```
  n_vars=5000,clauses_to_variables_ratio=4267  # for satisfiability
  n_nodes=600                                  # for vehicle_routing
  n_items=1000,budget=5                        # for knapsack
  n_queries=2000                               # for vector_search
  n_h_edges=10000                              # for hypergraph
  n_hidden=4                                   # for neuralnet_optimizer
  ```

- Recommended hyperparameters setting to check your code works: replace `<HYPERPARAMETERS>` with `null`
- Use `--help` to see more options
- Use `--verbose` to output individual instance commands that you can run. For example:

  ```
  /usr/local/bin/tig-runtime '{"algorithm_id":"","challenge_id":"c005","track_id":"num_hyperedges=10000","block_id":"","player_id":""}' rand_hash 99 ./tig-algorithms/lib/hypergraph/arm64/hyper_improved.so --fuel 100000000000
  ```

  Running these commands will get you the full stdout, such as any `println` output you make in your algorithm.
Understanding Tracks
Tracks are the range of instances each algorithm is evaluated across for a given challenge. Each track is defined by a unique set of parameters (e.g., size or complexity) that determine the difficulty and characteristics of the challenge instances.
For example, in the Boolean Satisfiability challenge, a track might be defined as n_vars=5000,clauses_to_variables_ratio=4267, which specifies the number of variables and the ratio of clauses to variables. In the Vehicle Routing challenge, a track might be n_nodes=600, specifying the number of nodes in the routing problem.
Where to find available tracks:
- Challenge PDFs: Each challenge’s PDF document (available in Challenges) describes all available tracks and their parameters
- TIG Dashboard: Visit the Challenges page on the main dashboard to see active tracks for each challenge
When testing your algorithm, you should test it across multiple tracks to ensure it performs well under different problem sizes and configurations. Benchmarkers are randomly assigned tracks per benchmark, so your algorithm needs to handle all available tracks for your chosen challenge.
Understanding Hyperparameters
Hyperparameters are algorithm-specific configuration settings that control how your algorithm explores the search space and finds solutions. Unlike tracks (which are defined by the challenge), hyperparameters are defined by you, the algorithm developer, and allow you to tune your algorithm’s behavior.
Examples of hyperparameters might include:
- Search depth or iteration limits
- Learning rates or step sizes
- Any other tunable parameters specific to your algorithm
Refer to your challenge’s template.rs file (located at tig-algorithms/src/<CHALLENGE>/template.rs) for how to define hyperparameters and the exact implementation details.
As you develop your algorithm, experiment with different hyperparameter values to find optimal settings for different tracks. You can pass hyperparameters as a JSON string when using test_algorithm.
When testing, pass null as the hyperparameters argument to use default settings. Document your hyperparameters in the help() function (see Adding Information for Benchmarkers below) so Benchmarkers can optimize their performance with your algorithm.
Developing Your Code
You should now edit my_first_algo.rs so it can actually find solutions.
The template.rs for each Challenge tells you which function(s) need implementing. This is typically solve_challenge, but not always.
A good place to start developing your code is to copy what other Innovators have submitted for your Challenge and iterate on it. The main branch contains algorithms that have been merged because they were consistently adopted by Benchmarkers. You can find the code in:
https://github.com/tig-foundation/tig-monorepo/blob/main/tig-algorithms/src/<CHALLENGE>/<ALGORITHM>
Each algorithm also has its own branch on tig-monorepo, named <CHALLENGE>/<ALGORITHM>. You can find the code in:
https://github.com/tig-foundation/tig-monorepo/blob/<CHALLENGE>/<ALGORITHM>/tig-algorithms/src/<CHALLENGE>/<ALGORITHM>
Saving Solutions
The save_solution function as detailed in template.rs is used to output solutions. You may call this function periodically to save your solution progress as your algorithm runs.
Algorithms that fail to produce a solution will be rejected. Use save_solution to ensure your algorithm always outputs a result.
Adding Information for Benchmarkers
The template.rs includes a help() function that you should implement to help Benchmarkers understand how to run and use your code optimally:
```rust
pub fn help() {
    // Print help information about your algorithm here. It will be invoked with the `help_algorithm` script
    println!("No help information provided.");
}
```

Use this function to document:
- Recommended hyperparameter settings for optimal performance
- Any other tips for getting the best results from your algorithm
Benchmarkers can view your help information by running:
```
help_algorithm <ALGORITHM>
```

To check whether your code is competitive, you should test the code from other Innovators and compare scores.
Testing Existing Code Submissions
You will need to be inside the dev docker image:

```
list_algorithms
download_algorithm <ALGORITHM>
test_algorithm <ALGORITHM> <TRACK> <HYPERPARAMETERS>
help_algorithm <ALGORITHM>
```

list_algorithms and download_algorithm have a --testnet option for targeting testnet instead of mainnet.
Use help_algorithm <ALGORITHM> to view helpful information provided by the algorithm’s author.
The avg_quality from test_algorithm is the average of the individual qualities of each nonce tested. It is displayed as a six-digit integer, which translates to a six-decimal-place ratio:

```
100,000 -> 0.100000
```

Once you are happy that your code is competitive, you can submit it to TIG.
Making Your Submission
Adding README.md
You will need to copy the template.md for your challenge (e.g. tig-algorithms/src/satisfiability/template.md), rename to README.md, and fill in submission details:
```
## Submission Details

* **Challenge Name:** hypergraph
* **Algorithm Name:** [name of submission]
* **Copyright:** [year work created] [name of copyright owner]
* **Identity of Submitter:** [name of person or entity submitting the work to TIG]
* **Identity of Creator of Algorithmic Method:** [if applicable else null]
* **Unique Algorithm Identifier (UAI):** [if applicable else null]
```

Tip: You can check against this regex
About UAI and Creator of Algorithmic Method:
- UAI stands for Unique Algorithm Identifier. Every Advance submission has a UAI that can be found in `tig-algorithms/advances/<CHALLENGE>/<ADVANCE>.md`
- If you are iterating on existing code, you should keep the same UAI and Identity of Creator of the Algorithmic Method
- If your code is based on a method you found outside of TIG, set the UAI to `null`, and set Identity of Creator of the Algorithmic Method to the name(s) of the creators (if not known, also set to `null`)
- If your code is implementing a method described by an Innovator's Advance submission, find and copy its UAI and Identity of Creator of the Algorithmic Method
- In the case of an error with a submission, code submissions with the same name and UAI can be submitted multiple times by the same submitter in a given round.
Making a Submission
You will need 10 TIG to make a submission. It will be deducted from your Available Fee Balance. You can top up your balance on the Main Dashboard/Testnet Dashboard by selecting Top-Up TIG on the sidebar.
You can request Testnet TIG on Base Sepolia Testnet using the TIG faucet.
To make your submission, head over to the Submission page under the Innovation tab.

Your submission can consist of multiple .rs files, multiple CUDA .cu files, and README.md.
Important for multiple .rs files:
- If you have multiple `.rs` files, `mod.rs` must be one of the files
- The `solve_challenge` function must be defined in `mod.rs`
- All other `.rs` files must be imported in `mod.rs` using `mod other_file_name;`
For example, if you have helper.rs and utils.rs in addition to mod.rs, your mod.rs should include:
```rust
mod helper;
mod utils;

pub fn solve_challenge(...) {
    // Your implementation
}
```

Select all your files and click on Submit Algorithm.
- Submissions are final and CANNOT BE MODIFIED after they are made.
- DO NOT MAKE SUBMISSIONS TO TESTNET IF YOU DON’T WANT IT TO BE PUBLIC