# RADIX Wiki — Full Content Export

> This is the full-text version of llms.txt for https://radix.wiki
> 588 pages, last generated 2026-02-25

## ZeroCollateral

URL: https://radix.wiki/community/zerocollateral
Updated: 2026-02-25

Welcome to ZeroCollateral

This is your personal community page. Edit it to share your thoughts, projects, and contributions with the RADIX community.

About Me

Tell the community about yourself...

## Radix Engine

URL: https://radix.wiki/contents/tech/core-protocols/radix-engine
Updated: 2026-02-25

The Radix Engine [ /ˈrædɪks ˈɛnʤɪn/ ] is a protocol-level runtime environment and application layer, analogous to the Ethereum Virtual Machine (https://ethereum.org/en/developers/docs/evm/) (EVM). The Radix Engine uses finite state machines (/contents/tech/core-concepts/finite-state-machines) to define permitted asset states, improving the developer experience for building decentralized applications (dApps) by moving asset logic into the protocol layer.

### Overview

The Radix Engine is the core protocol component that executes transactions and maintains ledger state in the Radix network. It is responsible for sharding the ledger and processing transactions in a scalable, secure way.

Key aspects of the Radix Engine:

- Implements a UTXO model customized for scalability and performance.
- Breaks ledger state into substates that can be sharded across nodes.
- Groups components and their owned resources into single substates for efficiency.
- Uses a sharding model that can scale to an effectively unlimited number of shards.
- Executes single-shard transactions via consensus within each shard.
- Processes multi-shard transactions via cross-shard coordination.
- Supports transactions that specify intent rather than direct substate inputs/outputs.
- Resolves conflicts between concurrent transactions by aborting one of them.
- Verifies transaction validity via owner signatures rather than script conditions.
- Provides inherent atomicity and isolation via the construction/destruction mechanism.

The Radix Engine can process transactions in parallel across shards while maintaining a globally consistent ledger state. Concurrency is maximized by specifying intent and resolving conflicts through selective transaction abortion. Cross-shard coordination for multi-shard transactions is minimized by the sharding model and the construction/destruction mechanism. This architecture enables the linear scalability needed to support global finance applications with high transaction throughput.

The Radix Engine is one of the few application layers compatible with the Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) consensus protocol.

### Game Engine Analogy

The Radix Engine has been characterized as a game engine for DeFi (https://www.youtube.com/watch?v=fdG-JVT6bsc&t=1575s) : it defines and enforces properties such as asset permanence, transfer logic, minting, and burning, much as a physics engine (https://en.wikipedia.org/wiki/Physics_engine) simulates gravity, collisions, rotation, and other physical properties for game assets.

> “[Building a high-quality game] would have been utterly impossible without the game engine, because they would have spent 90 percent of their time trying to reimplement the physics. Building DeFi is now a question of ‘What's the functionality you want?’, not ‘Well, let's start by defining what we mean by an asset, or what we mean by a transfer, or what we mean by an atomic swap, or something like that.’ All of that is just given by the Radix Engine.” - Matthew Hine (https://youtu.be/fdG-JVT6bsc?t=2363)

### Properties

#### Asset Management

The Radix Engine ensures that fungible and non-fungible assets are understood at the protocol level, rather than having to be implemented by developers as local smart contract applications.
This means that tokens can be created via an API request, without the need for a smart contract.

#### Error Handling

All illegal operations are automatically blocked.

#### Object Permanence

The Radix Engine enforces object permanence: a token can exist in only one location at a time and can be moved only by agents with the required permissions.

### Development History

#### Radix Engine v1

V1 was released with Olympia (/contents/tech/releases/radix-mainnet-olympia) and introduced the basic functionality of transferring $XRD between wallets and staking/unstaking to validator nodes.

#### Radix Engine v2

V2 was released on an early-access basis with the Alexandria (/contents/tech/releases/radix-developer-environment-alexandria) developer environment and is scheduled for full release with Babylon (/contents/tech/releases/radix-mainnet-babylon). V2 enables developers to create blueprints and components for decentralized applications (dApps).

### Disadvantages of the Ethereum Virtual Machine (EVM)

The Ethereum Virtual Machine (EVM) is the runtime environment that executes smart contracts on the Ethereum blockchain. It is a central part of the Ethereum ecosystem, enabling developers to create and deploy decentralized applications (dApps) that interact with the chain. However, the EVM has several disadvantages, including its complexity, its exposure to exploits, and its limitations in scalability and performance.

One disadvantage is complexity. The EVM is a stack-based virtual machine: it uses a stack data structure to store and manipulate data during execution, which can make it difficult to reason about for developers unfamiliar with stack-based machines. In addition, smart contracts for the EVM are typically written in Solidity, a specialized language created for the Ethereum ecosystem.
Solidity is a relatively young and specialized language, which can make it difficult for developers to learn and use.

Another disadvantage is exposure to exploits. The EVM is a decentralized, open platform: anyone can deploy smart contracts to it. This also means that contracts containing vulnerabilities can be deployed and then exploited by malicious actors. For example, in 2016 an attacker exploited a vulnerability in the smart contract code of The DAO, a decentralized autonomous organization built on Ethereum, to drain millions of dollars worth of ether. The flaw was in The DAO's contract code rather than in the EVM itself, but the episode shows how unforgiving the environment is toward contract-level bugs.

Finally, the EVM has limitations in scalability and performance. Because it executes smart contracts in a decentralized, trustless manner, every transaction on the Ethereum blockchain must be verified and executed by the network. This can lead to slow transaction times and high fees, making it difficult for the EVM to handle large transaction volumes or to support high-throughput dApps.

In conclusion, the EVM's complexity, exposure to exploits, and limits in scalability and performance highlight the need for continued development and improvement, in order to make it more user-friendly, secure, and scalable.
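The protocol-level asset model described earlier (permitted asset states defined by finite state machines, object permanence, automatic blocking of illegal operations) can be illustrated with a toy sketch. This is a hypothetical simplification, not the actual Radix Engine implementation: a token is modeled as a small state machine whose only possible transitions are the ones the "protocol" defines, so illegal operations fail by construction rather than relying on per-contract code.

```python
# Toy illustration of protocol-enforced asset rules (hypothetical, simplified;
# not Radix Engine code). A token is a finite state machine: only transitions
# the protocol defines as legal exist, so illegal operations are blocked
# by construction.

class Token:
    def __init__(self, owner):
        self.owner = owner          # object permanence: exactly one location
        self.state = "in_vault"     # permitted states: "in_vault", "burned"

    def transfer(self, caller, new_owner):
        # Only the current owner may move a live token.
        if caller != self.owner or self.state != "in_vault":
            raise PermissionError("illegal operation blocked at protocol level")
        self.owner = new_owner

    def burn(self, caller):
        # Burning is a one-way transition out of circulation.
        if caller != self.owner or self.state != "in_vault":
            raise PermissionError("illegal operation blocked at protocol level")
        self.state = "burned"

token = Token(owner="alice")
token.transfer(caller="alice", new_owner="bob")
print(token.owner)  # bob
try:
    token.transfer(caller="alice", new_owner="carol")  # alice no longer owns it
except PermissionError:
    print("rejected")
```

In the smart-contract model, every developer re-implements checks like these; in the protocol-level model, they exist once and apply to every asset.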
### Further Reading

- https://www.radixdlt.com/post/radix-engine-a-simple-secure-smart-contract-alternative

### Development

- Initial Release: 2021-07-28 (https://www.radixdlt.com/post/radix-engine-a-simple-secure-smart-contract-alternative)
- Latest Release: v2 (https://github.com/radixdlt/radixdlt/tree/main/radixdlt-engine)
- Code Repository: github.com/radixdlt/radixdlt/tree/main/radixdlt-engine (https://github.com/radixdlt/radixdlt/tree/main/radixdlt-engine)
- License: Radix License, v1 (http://radixfoundation.org/licenses/license-v1)

## Radix Context for AI Agents

URL: https://radix.wiki/developers/infrastructure/radix-context
Updated: 2026-02-25

Summary: Curated reference docs for AI coding agents to understand Scrypto, transaction manifests, and Radix dApp development.

- Type: Developer Tooling
- Author: xstelea (https://github.com/xstelea)
- Context Files: 19
- Domains: Scrypto, Gateway API, Transactions, Wallet, Effect
- Standard: agents.md (https://agents-md.org)
- Repository: GitHub (https://github.com/xstelea/radix-context)

### Introduction

radix-context (https://github.com/xstelea/radix-context) is a collection of 19 curated reference documents designed for AI coding agents — Claude, Cursor, Windsurf, and similar tools — to understand how to build dApps and Scrypto components on the Radix (/contents/tech/core-concepts/radix-engine) network. The project follows the agents.md standard (https://agents-md.org), enabling compatible tools to auto-discover and load relevant documentation.

Rather than requiring an AI agent to search scattered docs, radix-context provides pre-structured, deep-reference material that agents can ingest directly. This dramatically improves the quality of AI-generated Scrypto code and Radix transaction manifests.
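Under the agents.md convention, a top-level AGENTS.md file points compatible tools at the context material. The fragment below is illustrative only (the file names and wording are assumptions, not the repository's actual index):

```markdown
# AGENTS.md (illustrative fragment)

## Radix context files

When generating Scrypto code or transaction manifests, consult the
pre-structured references under context/ before searching elsewhere:

- context/radix-TransactionManifest.md — manifest construction and validation
- context/radix-Sbor.md — SBOR wire format and schema system
- context/effect-Schema.md — runtime validation for dApp front ends
```

Tools that follow the standard load this index automatically when opened in the project directory.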
### Installation

One-line install into any project directory:

curl -fsSL https://raw.githubusercontent.com/xstelea/radix-context/main/install.sh | bash

This copies 19 context files into a context/ directory and merges an AGENTS.md index file. Optionally run ./setup.sh to clone companion repositories (Scrypto source, radix-dapp-toolkit, Effect, TanStack Router, and more) into .repos/ for full cross-reference.

### Context Files

Radix DLT & Scrypto (9 files):

- radix-AccessRule — Permission systems and role-based access control in Scrypto
- radix-Account — Native account blueprint with 30 core methods
- radix-Gateway — Effect-wrapped Gateway API client with retry logic
- radix-GatewayRustSdk — Rust HTTP clients for Gateway and Core APIs
- radix-Sbor — SBOR (/contents/tech/core-concepts/sbor) wire format and schema system
- radix-SubIntents — Composable partial transactions for governance workflows
- radix-TransactionManifest — Manifest construction and validation (V1/V2)
- radix-radix-dapp-toolkit — Wallet integration, ROLA authentication, and signing
- radix-transactions — Rust transaction building and signing

Effect Library (8 files):

- effect-Context — Dependency injection with Context.Tag
- effect-Layer — Composable service blueprints with memoization
- effect-Pipe — Piping utilities including pipe() and Effect.gen
- effect-Platform — HTTP, filesystem, terminal, and networking abstractions
- effect-Queue — Fiber-safe async queues with backpressure
- effect-Rpc — Type-safe, transport-agnostic RPC
- effect-Schema — Runtime validation and type-safe transformations
- effect-atom — Reactive state bridging Effect and React

Framework (2 files):

- tanstack-Router — Type-safe routing with SSR and file-based routes
- tanstackStart-ConsultationDapp — Full-stack dApp reference using React 19 and Effect

### Companion Repositories

Running ./setup.sh clones the following shallow repositories into .repos/ for full source-level context:

- radixdlt-scrypto (https://github.com/radixdlt/radixdlt-scrypto) —
Scrypto source code (develop branch)
- radix-dapp-toolkit (https://github.com/radixdlt/radix-dapp-toolkit) — Official dApp toolkit
- consultation_v2 (https://github.com/radixdlt/consultation_v2) — Consultation dApp reference implementation
- radix-gateway-api-rust (https://github.com/ociswap/radix-client) — Rust Gateway API client (by Ociswap)
- TanStack Router (https://github.com/TanStack/router) — Type-safe routing framework
- Effect (https://github.com/Effect-TS/effect) — Functional TypeScript library

### External Links

- radix-context — GitHub Repository (https://github.com/xstelea/radix-context)
- agents.md Standard (https://agents-md.org)
- Radix Official Documentation (https://docs.radixdlt.com)
- Radix DLT on GitHub (https://github.com/radixdlt)

## Main

URL: https://radix.wiki/community/main
Updated: 2026-02-25

Welcome to Main

This is your personal community page. Edit it to share your thoughts, projects, and contributions with the RADIX community.

About Me

Tell the community about yourself...

## gilesmorris.me

URL: https://radix.wiki/community/gilesmorris-me
Updated: 2026-02-24

Welcome to gilesmorris.me

This is your personal community page. Edit it to share your thoughts, projects, and contributions with the RADIX community.

About Me

Tell the community about yourself...

## Radix Developer Resources

URL: https://radix.wiki/contents/resources/radix-developer-resources
Updated: 2026-02-24

A selection of resources for developers building on Radix.

### ✅ Installation

Applications on Radix can be prototyped quickly using the Radix Engine Simulator (resim). More information is available on the Radix Documentation (https://docs.radixdlt.com/docs/essentials) page.

- Download and install VS Code (https://code.visualstudio.com/).
- Click on the Extensions sidebar icon and install the ‘Dev Containers’ extension. Follow the prompts to install Docker Desktop.
- Make a new development folder, then navigate to it in VS Code via File > Open Folder.
- Open a terminal window in VS Code via Terminal > New Terminal.
- Run the following command to clone the official Scrypto Starter repository:
  git clone https://github.com/radixdlt/scrypto-starter.git
- Navigate to the scrypto-starter folder via File > Open Folder. (IMPORTANT: don’t use ‘cd scrypto-starter’ at this stage.)
- VS Code should suggest opening the repository in a Dev Container. If so, follow the prompts and wait for the container to load.
- If VS Code does not suggest a Dev Container:
  - Open the Command Palette via View > Command Palette.
  - Search for "Dev Containers: Reopen in Container" and click on it. Follow the prompts to log in or create a Docker Desktop account.
- Run the following command to see the resim functionality:
  resim help
- Skip the setup and navigate to the main body of the Step by Step Course (https://docs.radixdlt.com/docs/learning-to-run-your-first-scrypto-project#creating-an-account).

### 🏁 Starter Pack

The following resources are available to help you build and grow your application on Radix:

- Learn Step by Step (https://docs.radixdlt.com/docs/learning-to-run-your-first-scrypto-project#creating-an-account).
- Deepen your understanding with the Radix Technical Documentation (https://docs.radixdlt.com/docs).
- Get inspired by Official Code Examples (https://github.com/radixdlt/scrypto-examples).
- Play with the Transaction Manifest Builder (https://instruct.radixbillboard.com/).
- Dive into our resource-rich Discord (https://discord.com/channels/417762285172555786/968472959763243068) and Telegram (https://t.me/RadixDevelopers) channels.
- Apply for the Radix Booster Grants (/contents/resources/radix-booster-grants).
### 🛢️ General Resources

Docs:

- Radix Wallet (/contents/tech/core-protocols/radix-wallet)
- Scrypto Docs (https://docs.radixdlt.com/docs/learning-step-by-step)
- Radix Engine Docs (https://docs.radixdlt.com/docs/radix-engine)
- Gateway Docs (https://radix-babylon-gateway-api.redoc.ly/)
- ROLA Docs (https://docs.radixdlt.com/docs/rola-radix-off-ledger-auth)
- Core Docs (https://radix-babylon-core-api.redoc.ly/)
- System Docs (https://radix-babylon-system-api.redoc.ly/)

Repos:

- iOS Wallet (https://github.com/radixdlt/babylon-wallet-ios)
- Android Wallet (https://github.com/radixdlt/babylon-wallet-android)
- Scrypto Repo (https://github.com/radixdlt/radixdlt-scrypto)
- Radix Engine Repo (https://github.com/radixdlt/radixdlt-scrypto)

APIs:

- Gateway API (https://mainnet.radixdlt.com/swagger/)
- Radix API (https://api.radixapi.net/docs)
- Radix Network APIs (https://docs.radixdlt.com/v1/docs/network-apis)

Packages:

- Radix Wallet Connector (https://chromewebstore.google.com/detail/radix-wallet-connector/bfeplaecgkoeckiidkgkmlllfbaeplgm)
- Scrypto Rust Crate (https://docs.rs/scrypto/latest/scrypto/index.html)
- Scrypto-test Rust Crate (https://docs.rs/scrypto-test/latest/scrypto_test/)
- Radix Engine Crate (https://docs.rs/radix-engine/latest/radix_engine/)
- ROLA Package (https://www.npmjs.com/package/@radixdlt/rola)

Toolkits:

- Scrypto Examples (https://github.com/radixdlt/official-examples/) / Community Examples (https://github.com/radixdlt/community-scrypto-examples)
- Radix Engine Toolkit (/developers/legacy-docs/integrate/radix-engine-toolkit/radix-engine-toolkit)
- Radix dApp Toolkit (https://docs.radixdlt.com/docs/en/dapp-toolkit)

SDKs:

- Gateway API SDK (https://www.npmjs.com/package/@radixdlt/babylon-gateway-api-sdk)
- Core API SDK (https://www.npmjs.com/package/@radixdlt/babylon-core-api-sdk)

### ⿻ Scrypto Resources

- scrypto-math (https://github.com/ociswap/scrypto-math)
- scrypto-avltree (https://github.com/ociswap/scrypto-avltree)
- ice_rrc404v1 (https://github.com/aus87/ice_rrc404v1)

## UltraRare Bio

URL: https://radix.wiki/ecosystem/ultrarare-bio
Updated: 2026-02-24

Summary: DeSci team curating the comprehensive DeSci Wiki and producing the UltraRare podcast on decentralized science.

- Type: DeSci Media & Research
- Focus: DeSci Education, Curation, Advocacy
- Key Output: DeSci Wiki, UltraRare Podcast
- Curated By: Jocelynn Pearl (Ph.D.)
- Landscape By: Dani Bergev, Katie Koczera
- Website: ultrarare.bio (https://ultrarare.bio)

### Overview

UltraRare Bio is a DeSci media and research team that produces two foundational resources for the decentralized science community: the DeSci Wiki (https://docs.google.com/document/d/1aQC6zn-eXflSmpts0XGE7CawbUEHwnL6o-OFXO52PTc/) — a comprehensive directory of 80+ DeSci projects, DAOs, and communities — and the UltraRare Podcast, covering developments across the DeSci ecosystem.

The DeSci Wiki is curated by Jocelynn Pearl, Ph.D., with the Decentralized Science Landscape figure created by the UltraRare Bio team, including Dani Bergev and Katie Koczera (graphic design). The wiki is community-maintained and updated approximately once per week, serving as the primary reference directory for the DeSci movement.

### DeSci Wiki

The DeSci Wiki catalogs the DeSci ecosystem across categories including DeSci DAOs, decentralized science platforms, science NFTs, decentralized biotech and biopharma, decentralized finance for science, focused research institutes, social communities (IRL, Discord, Telegram), and archived projects.

It also tracks the DeSci landscape through a regularly updated visual map showing projects across financing, ecosystems, data storage and protocols, open science, science art and NFTs, and communities. The wiki has been a key community resource for tracking which projects are active, which have shut down, and how the ecosystem is evolving over time.
### External Links

- DeSci Wiki — Full Project Directory (https://docs.google.com/document/d/1aQC6zn-eXflSmpts0XGE7CawbUEHwnL6o-OFXO52PTc/)
- UltraRare Bio — Official Website (https://ultrarare.bio)

## Gitcoin DeSci Grants

URL: https://radix.wiki/ecosystem/gitcoin-desci
Updated: 2026-02-24

Summary: Quadratic funding platform that ran the first dedicated DeSci grants round, funding 82+ science projects.

- Type: Funding Platform
- Focus: Quadratic Funding for DeSci
- First DeSci Round: GR15
- GR15 DeSci Pool: $567k+ (82 projects)
- GR15 Donors: 2,309
- Mechanism: Quadratic Funding
- Website: gitcoin.co (https://www.gitcoin.co)

### Overview

Gitcoin (https://www.gitcoin.co) is the leading quadratic funding platform in the Ethereum ecosystem. Its Grants Round 15 (GR15) featured the first-ever dedicated DeSci Round (https://www.gitcoin.co/blog/gr15-results), using quadratic funding to expand and distribute humanity's shared knowledge by reimagining the incentives, culture, and infrastructure for research using Web3 tools.

Quadratic funding matches community donations using a formula that weights the number of contributors over the size of individual contributions, so broadly supported projects receive proportionally more matching funds. This democratizes science funding (/contents/tech/desci/desci-funding) in ways traditional grant committees cannot.

### DeSci Round Results

The GR15 DeSci Round saw 2,309 donors crowdfund $67.9k for over 82 projects (https://www.gitcoin.co/blog/gr15-results). With matching partners, the total pool exceeded $567k. Funded projects created new incentives for scientists to engage in open practices, validate research results outside the university system, build a culture of collective problem-solving, and develop infrastructure for a knowledge commons.

Gitcoin's 2025 strategy focuses on distributing over $4.29 million across three major open-source software rounds, with Dynamic Quadratic Funding to adjust eligibility and caps for fairer distribution.
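The matching rule described above can be sketched numerically. In the textbook quadratic funding formula, a project's notional match grows with the square of the sum of the square roots of its contributions, so many small donors beat one large one. Real rounds such as Gitcoin's additionally scale results to fit the matching pool and apply eligibility rules and caps; this sketch shows only the core formula.

```python
from math import sqrt

def qf_match(contributions):
    """Un-normalized quadratic funding match: (sum of sqrt(c))^2 minus the
    raw crowdfunded total. Textbook formula only; real rounds normalize to
    the matching pool and apply caps."""
    raw = sum(contributions)
    return sum(sqrt(c) for c in contributions) ** 2 - raw

# 100 donors giving $1 each vs one donor giving $100:
many_small = qf_match([1] * 100)   # (100 * sqrt(1))^2 - 100 = 9900
one_large = qf_match([100])        # sqrt(100)^2 - 100 = 0
print(many_small, one_large)
```

Both projects crowdfund the same $100, but the broadly supported one attracts a far larger notional match, which is the property the Overview describes.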
While DeSci was not selected as a primary domain for GG24, the model established in GR15 demonstrated the viability of community-driven science funding.

### External Links

- Gitcoin — Official Website (https://www.gitcoin.co)
- DeSci: The Case for Decentralised Science — Gitcoin Blog (https://www.gitcoin.co/blog/desci-the-case-for-decentralised-science)
- GR15 Results & Recap — Gitcoin Blog (https://www.gitcoin.co/blog/gr15-results)

## Foresight Institute

URL: https://radix.wiki/ecosystem/foresight-institute
Updated: 2026-02-24

Summary: Research organization advancing frontier science and technology since 1986, supporting longevity and nanotech.

- Type: Research Organization
- Founded: 1986
- HQ: San Francisco, CA
- Focus Areas: Secure AI, Neurotech, Longevity, Nanotech, Space
- Notable Alumni: David Baker (Nobel 2024), J. Fraser Stoddart (Nobel 2016)
- Annual Seminars: 40+
- Website: foresight.org (https://foresight.org)

### Overview

Foresight Institute has been advancing frontier science and technology since 1986 (https://foresight.org), supporting pioneering scientists and innovators working on technologies too ambitious or interdisciplinary for legacy institutions. Based at 50 California Street in San Francisco, the institute operates at the intersection of DeSci and traditional research.

The institute's track record includes hosting the first nanotech conference (1989), coining the term "open-source software" (1998), running early prediction markets (1994), and pioneering AGI dialogue (1997). Researchers supported by Foresight have won Nobel Prizes, including David Baker (2024, Chemistry) and J. Fraser Stoddart (2016, Chemistry) (https://foresight.org).

### Programs & Focus Areas

Foresight operates across six domains: Secure AI, Neurotechnology, Longevity Biotechnology, Nanotechnology, Space, and Existential Hope.
Key programs include:

- Vision Weekend — Flagship conferences gathering scientists, entrepreneurs, and policymakers
- Grants — Supporting AI applications in science and safety
- Prizes — Annual Feynman Prize (nanotechnology) and Hardy Prize (usable security)
- Fellowship Program — Supporting early-career innovators
- AI Nodes — Physical hubs launching in San Francisco and Berlin in early 2026
- Knowledge Dissemination — 40+ online seminars annually, podcasts, and reports

### External Links

- Foresight Institute — Official Website (https://foresight.org)

## PsyDAO

URL: https://radix.wiki/ecosystem/psydao
Updated: 2026-02-24

Summary: Decentralized collective advancing psychedelic science through community-driven research funding and open IP.

- Type: DeSci DAO
- Focus: Psychedelic Research
- Grants Awarded: 6
- Total Funded: $280k
- Governance: Ostrom-inspired dual-token ($PSYC / $PSY)
- Parent: Bio Protocol (/ecosystem/bio-xyz)
- Website: psydao.io (https://www.psydao.io)

### Overview

PsyDAO is a decentralized collective advancing psychedelic science through community-driven research funding and open intellectual property (https://www.psydao.io). The organization promotes progress via distributed coordination, crowdfunding, and collective ownership of IP, with the aim of keeping psychedelic medicine accessible to anyone.

PsyDAO uses an Ostrom-inspired governance system (https://www.psydao.io) with two token classes: $PSYC holders govern research direction, while $PSY holders oversee operations. This dual-token model is intended to ensure both scientific rigor and operational accountability.
### Research Projects

PsyDAO has awarded 6 grants totaling $280k across science, technology, and culture:

Science:

- Next Gen Empathogens — MDMA prodrug optimization
- eDMT — Endogenous DMT research commons
- Psychedelics X Longevity — LSD vs psilocin comparison study
- DMXE — Dissociative drug development

Technology:

- BeeARD — Multi-agent AI research discovery system
- OPSY — Token-incentivized data collection platform

Culture:

- KeneCafé — E-commerce platform for an Amazonian nonprofit, bridging IP and traditional knowledge

### External Links

- PsyDAO — Official Website (https://www.psydao.io)
- Bio Protocol Ecosystem (https://www.bio.xyz)

## SCINET

URL: https://radix.wiki/ecosystem/scinet
Updated: 2026-02-24

Summary: Blockchain-powered cloud laboratory and publishing platform for decentralized scientific research.

- Type: DeSci Platform
- Focus: On-Chain Science Publishing & Cloud Lab
- Status: Active
- Chain: Internet Computer (ICP)
- Key Feature: 40% faster peer review times
- Website: scinet.one (https://scinet.one)

### Overview

SCINET is a decentralized platform that fuses blockchain technology with scientific research and innovation (https://cryptorank.io/news/feed/bd1d8-exploring-the-capabilities-scinet-in-desci), with a core mission of accelerating advances in the life sciences. Built on the Internet Computer blockchain, it aims to create an ecosystem that fosters collaboration, transparency, and accessibility in scientific research.

The platform provides a blockchain-powered cloud laboratory that augments traditional research setups with an interface intended to promote good science, ensure regulatory compliance, and facilitate collaboration.

### Key Features

Cloud Laboratory: Scientists can conduct experiments, record data, and manage research processes on decentralized infrastructure, with data integrity backed by immutable blockchain records.
IP Documentation: Researchers can document their intellectual property on an immutable blockchain, ensuring discoveries and innovations are secure and properly credited regardless of storage or usage changes.

Decentralized Publishing & Peer Review: SCINET offers a platform for publishing research findings and undergoing peer review, with blockchain-based peer review reducing review times by 40% (https://cryptorank.io/news/feed/bd1d8-exploring-the-capabilities-scinet-in-desci) and improving feedback quality through clear, verifiable records of reviewer contributions.

Investor Funding: The platform connects researchers with untapped sources of investor funding, making it easier to finance innovative projects outside traditional grant systems.

### External Links

- SCINET — Crypto Altruism Podcast Interview (https://www.cryptoaltruism.org/blog/crypto-altruism-podcast-episode91-scinet-how-blockchain-will-revolutionize-the-way-people-do-and-support-science)
- Capabilities of SCINET in DeSci — CryptoRank (https://cryptorank.io/news/feed/bd1d8-exploring-the-capabilities-scinet-in-desci)

## DeSci Foundation

URL: https://radix.wiki/ecosystem/desci-foundation
Updated: 2026-02-24

Summary: Swiss non-profit advancing reliable, transparent, and reproducible science through decentralized infrastructure.

- Type: Non-Profit Foundation
- Focus: DeSci Coordination & Infrastructure
- Status: Active
- Key Initiative: DeSci Nodes (IPFS/dPID)
- Focus Areas: Validation, Reproducibility, Funding, AI in Science, Mechanism Design
- Website: descifoundation.org (https://descifoundation.org)

### Overview

The DeSci Foundation is a non-profit advancing reliable, transparent, and reproducible science through open, decentralized systems (https://www.descifoundation.org/about). Its mission is to explore and support improvements in the scientific ecosystem based on insights from meta-science and opportunities afforded by decentralized technologies.
The foundation's focus areas include scientific validation, replicability and reproducibility of research, efficient funding mechanisms, artificial intelligence in science, pluralism, and incentives and mechanism design.

### DeSci Nodes

The foundation's core technical initiative is DeSci Nodes, a platform for sharing original scientific research using content identifiers (CIDs) and persistent identifiers (dPIDs), built on IPFS. DeSci Nodes address critical problems in today's scientific infrastructure: link rot, content drift, data sharing barriers, storage costs, and monopolistic data silos.

By using decentralized storage and persistent identifiers, DeSci Nodes aim to keep scientific publications and datasets permanently accessible and verifiable — regardless of whether individual publishers or institutions continue to maintain their servers.

### External Links

- DeSci Foundation — Official Website (https://www.descifoundation.org/)
- The DeSci Movement — DeSci Foundation (https://www.descifoundation.org/future-of-science-contents/the-desci-movement)
- DeSci Nodes (https://nodes.desci.com)

## GenomesDAO

URL: https://radix.wiki/ecosystem/genomesdao
Updated: 2026-02-24

Summary: Biotech DAO enabling private, user-owned genomic data sequencing and monetization via encrypted DNA vaults.

- Type: DeSci DAO / Biotech
- Founded: 2018 (Genomes.io)
- Focus: Genomic Data Ownership & Privacy
- HQ: London, UK
- Investors: Pantera Capital, Modular Capital (June 2024)
- Token: GENOME
- Technology: AMD SEV-ES Encrypted Vaults
- Website: genomes.io (https://www.genomes.io)

### Overview

GenomesDAO, spearheaded by Genomes.io (https://www.genomes.io) (founded 2018, London), is a biotech DAO focused on the safe, private, and auditable monetization of genomic data. The platform combines AMD's SEV-ES encrypted virtualization, blockchain, and DeFi (https://genomes.gitbook.io/genomes.io-docs) to give individuals full ownership and control over their DNA data.
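This owner-controlled access model can be illustrated with a toy sketch. All names here are hypothetical, and the real platform enforces these rules inside AMD SEV-ES encrypted virtual machines rather than in application code: a vault answers only owner-approved questions, and only the answer (never the raw genome) leaves the vault.

```python
# Toy sketch of an owner-approved, answer-only query flow (hypothetical names;
# the real GenomesDAO vaults enforce this in encrypted hardware, not Python).

class DNAVault:
    def __init__(self, owner, genome):
        self._genome = genome        # raw data never leaves the vault
        self.owner = owner
        self._approved = set()       # query ids the owner has approved

    def approve(self, caller, query_id):
        # Only the vault owner can approve a specific query.
        if caller != self.owner:
            raise PermissionError("only the owner can approve queries")
        self._approved.add(query_id)

    def query(self, query_id, predicate):
        # Each approved query releases only the answer to one question.
        if query_id not in self._approved:
            raise PermissionError("query not approved by vault owner")
        return predicate(self._genome)

vault = DNAVault("alice", genome={"rs0000": "C"})  # placeholder variant data
vault.approve("alice", "variant-check")
print(vault.query("variant-check", lambda g: g.get("rs0000") == "C"))  # True
```

The point of the design is that a researcher learns one yes/no fact per paid, approved query, while the genome itself stays encrypted under the owner's control.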
GenomesDAO secured a new investment round led by Pantera Capital and Modular Capital in June 2024 (https://www.newswire.com/news/genomesdao-secures-new-investment-round-from-pantera-capital-and-22362025), validating the growing intersection of genomic data and decentralized infrastructure.

### How It Works

GenomesDAO works with trusted, industry-leading DNA sequencing providers to deliver clinical-quality (30x) whole genome sequencing that analyses 100% of the genome. User DNA data is stored in fully encrypted "DNA Vaults" using AMD SEV-ES technology, ensuring that only the user controls access.

Pharmaceutical companies and research organizations can query genomic data using GENOME tokens, with Genomes.io acting as a broker. Critically, users must approve each individual query of their vault — each query asks a specific question of the genome, and only the answer to that question is released, preserving privacy while enabling research.

### External Links

- Genomes.io — Official Website (https://www.genomes.io)
- GenomesDAO Documentation (https://genomes.gitbook.io/genomes.io-docs)
- GenomesDAO on X (https://x.com/GenomesDAO)

## AthenaBIO

URL: https://radix.wiki/ecosystem/athenadao
Updated: 2026-02-24

Summary: Decentralized community advancing women's health research, education, and funding through collaborative networks.

- Type: DeSci DAO
- Focus: Women's Health Research
- Status: Active
- Active Projects: 3
- Key Partners: Bio Protocol (/ecosystem/bio-xyz), Molecule (/ecosystem/molecule), VitaDAO (/ecosystem/vitadao)
- Chain: Ethereum
- Website: athenadao.co (https://www.athenadao.co)

### Overview

AthenaBIO (formerly AthenaDAO) is a decentralized community advancing women's health research, education, and funding (https://www.athenadao.co) through collaborative global networks. The organization sources, funds, and incubates academic and industry R&D by connecting researchers with legal, strategic, scientific, and market support.
AthenaBIO operates as a BioDAO within the Bio Protocol (/ecosystem/bio-xyz) ecosystem, utilizing blockchain technology and community-held tokens for transparent capital allocation in early-stage women's health research and biotech.

### Research Projects

AthenaBIO has three active research projects, all focused on ovarian aging:

- Novel targets to delay ovarian aging — In partnership with Gero, identifying new therapeutic targets
- ISR pathway for ovarian follicle survival — Investigating the integrated stress response pathway
- cGAS-STING pathway and senescence in ovarian aging — Studying inflammatory pathways in reproductive aging

The organization has also published "The Data and Wearables Report", addressing data ownership, governance, and value in health technology.

Key partners include Bio Protocol (/ecosystem/bio-xyz), Molecule (/ecosystem/molecule), VitaDAO (/ecosystem/vitadao), ID Theory, Beaker DAO, The LAO, and Paramita VC.

### External Links

- AthenaBIO — Official Website (https://www.athenadao.co)
- Bio Protocol Ecosystem (https://www.bio.xyz)

## Bio Protocol

URL: https://radix.wiki/ecosystem/bio-xyz
Updated: 2026-02-24

Summary: DeSci's financial layer for launching and funding biotech DAOs, backed by Binance Labs and Pfizer Ventures.

- Type: BioDAO Launchpad
- Founded: 2022
- Focus: Biotech DAO Incubation & Funding
- BioDAOs Launched: 11+
- Bio Genesis Raise: $33M+ (Nov 2024)
- BIO Token Market Cap: Exceeded $200M
- Chain: Ethereum
- Website: bio.xyz (https://www.bio.xyz)

### Overview

Bio Protocol (formerly Bio.xyz) describes itself as DeSci's new financial layer, engineered to commercialize the best science, faster (https://www.bio.xyz). The platform incubates and funds specialized biotech DAOs (BioDAOs), each focused on a specific disease area or scientific domain, creating an ecosystem of decentralized research organizations.
The platform raised over $33 million through Bio Genesis in November 2024 (https://www.bio.xyz) , with Binance Labs making its first DeSci investment. The BIO token market cap exceeded $200 million by November 2023. BioDAO Ecosystem Bio Protocol has launched 11+ specialized BioDAOs: - VitaDAO (/ecosystem/vitadao) — Longevity research (initiated July 2021) - AthenaBIO (/ecosystem/athenadao) — Women's health - PsyDAO (/ecosystem/psydao) — Psychedelic science - ValleyDAO — Synthetic biology - HairDAO — Hair loss research (filed first DAO scientific patent, Dec 2023) - CryoDAO — Cryopreservation - Cerebrum DAO — Brain health - Curetopia — Rare diseases - Long COVID Labs — Long COVID research - Quantum Biology DAO — Quantum biology - DermaLabs — Skincare science Key Products Ignition Sales: Low-cap fundraises for promising research that scale with community engagement and scientific milestones. BioXP: A loyalty points system earned through staking that grants early access to funding opportunities. BioAgents: A scientific AI agent platform for building decentralized scientific agents. External Links - Bio Protocol — Official Website (https://www.bio.xyz) - Bio Protocol $6.9M Raise Announcement — PR Newswire (https://www.prnewswire.com/news-releases/bio-protocol-raises-6-9m-to-launch-ai-native-decentralized-science-platform-for-biotech-funding--drug-discovery-302558983.html) ## Molecule URL: https://radix.wiki/ecosystem/molecule Updated: 2026-02-24 Summary: The platform that created IP-NFTs, enabling tokenized ownership and trading of biomedical research IP. Type DeSci Infrastructure Founded 2020 Focus IP-NFTs, Biotech IP Trading Active IP Tokens 46 Combined IPT Market Cap ~$16M All-time DEX Volume $61.4M Active Wallets 6,304 Chain Ethereum Website molecule.xyz (https://www.molecule.xyz) Overview Molecule is the infrastructure layer of DeSci, providing the IP-NFT (/contents/tech/desci/ip-nfts) framework that underpins most biotech DAOs. 
The platform enables trading of DeSci tokens and funding of biomedical research (https://www.molecule.xyz) by tokenizing intellectual property from scientific research into tradeable on-chain assets. Molecule's key innovation is attaching legal IP rights to NFTs, creating a new asset class that allows communities to co-own and govern research IP. The platform spans multiple scientific domains including longevity (29 projects, $9.4M market cap), brain health, women's health, psychedelics, cell therapy, and climate research. Products IP Token Trading: The core platform at labs.molecule.xyz (https://labs.molecule.xyz) enables trading of fractionalized IP tokens derived from IP-NFTs, with 296,533 total DEX transactions across 6,304 active wallets. IPNFT Explorer: A discovery tool for on-chain intellectual property, allowing users to browse and evaluate research IP. DeSci Screener: An IP token market analysis tool for evaluating biotech research assets. Documentation: Comprehensive guides at docs.molecule.to (https://docs.molecule.to) and educational resources at the Molecule learning hub. External Links - Molecule — Official Platform (https://www.molecule.xyz) - Molecule Documentation (https://docs.molecule.to) - Molecule on GitHub (https://github.com/moleculeprotocol) ## VitaDAO URL: https://radix.wiki/ecosystem/vitadao Updated: 2026-02-24 Summary: Community-owned collective funding early-stage longevity research through IP-NFTs and decentralized governance. Type DeSci DAO Founded July 2021 Focus Longevity Research Projects Funded 31 Capital Deployed $4.7M Companies Founded 3 IP Transactions 8 Chain Ethereum Website vitadao.com (https://www.vitadao.com) Overview VitaDAO is a community-owned collective dedicated to funding and advancing early-stage longevity science research (https://www.vitadao.com) . It was the first DeSci DAO, initiated in July 2021, and has since become the largest in the ecosystem. 
The organization aims to extend human healthspan and lifespan through accelerated research, open collaboration, and innovative funding structures. VitaDAO received a landmark $4.1 million investment from Pfizer Ventures in December 2022 (https://www.bio.xyz) , marking one of the first major pharmaceutical industry investments in a decentralized science organization. It operates as a BioDAO within the Bio Protocol (/ecosystem/bio-xyz) ecosystem. Research Portfolio VitaDAO has funded 31 projects (https://www.vitadao.com) across the drug development pipeline: 3 target discovery projects, 18 drug discovery projects, 3 preclinical drug development projects, and 3 clinical drug development projects. This has generated 8 intellectual property transactions and led to the founding of 3 companies, with a total project fully diluted valuation of $60 million. The DAO combines human expertise with AI agents — including AubrAI (longevity research specialist), Longevist AI for literature analysis, and a Biohacker Agent for compound formulation — to accelerate discovery in aging research. External Links - VitaDAO — Official Website (https://www.vitadao.com) - VitaDAO Discord (https://discord.gg/vitadao) - VitaDAO on X (https://x.com/vita_dao) ## DeSci on Radix URL: https://radix.wiki/contents/tech/desci/desci-and-radix Updated: 2026-02-24 Summary: How Radix's asset-oriented architecture could enable a new generation of DeSci applications. Type Ecosystem Opportunity Platform Radix Engine (/contents/tech/core-concepts/radix-engine) Language Scrypto (/developers/scrypto) Key Advantages Native assets, badges, atomic composability Status Greenfield opportunity Introduction Most DeSci infrastructure today is built on Ethereum, inheriting its account-based model and smart contract limitations. 
Radix's asset-oriented architecture (/contents/tech/core-concepts/radix-engine) offers a fundamentally different foundation for DeSci applications — one where resources (tokens, NFTs, data credentials) are first-class citizens in the execution environment rather than entries in a contract's internal ledger. This distinction matters for DeSci because the movement's core primitives — IP-NFTs, credential badges, DAO governance tokens, and research data attestations — are all assets. Building them on an asset-native platform eliminates entire categories of smart contract bugs that have plagued Ethereum-based DeSci projects. Badges as Scientific Credentials Radix's native badge system (/contents/tech/core-concepts/access-controller) provides a natural primitive for scientific credentials and peer review attestations. Unlike Ethereum's soulbound tokens (which are still smart contract entries), Radix badges are enforced at the engine level and can gate access to resources, functions, and governance actions. This enables verifiable, on-chain scientific reputation: a researcher's peer review history, publication record, or domain expertise could be represented as badges that unlock specific actions — like voting on funding proposals in their area of expertise, or accessing gated research data. Native NFTs for IP-NFTs Radix's native NFT standard means IP-NFTs would not depend on contract-level token standards (like ERC-721). The IP itself — licensing terms, royalty splits, governance rights — can be encoded directly in the NFT's non-fungible data, with the Radix Engine enforcing ownership and transfer rules without relying on external contract logic. Scrypto for DeSci DAOs Scrypto (/developers/scrypto) blueprints could model DeSci DAO governance with greater safety guarantees than Solidity. Resource-oriented programming means governance tokens are real assets that cannot be double-spent or reentrancy-exploited — addressing vulnerabilities that have affected Ethereum DAOs. 
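To make the badge-gating idea concrete, here is a minimal conceptual model in plain Python (not Scrypto; the badge kinds, domain names, and voting rule are illustrative assumptions — on Radix the engine itself enforces badge checks rather than application code):

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Badge:
    """A non-transferable credential, e.g. a peer-review attestation.
    (Hypothetical fields; on Radix a badge is an engine-level resource.)"""
    kind: str    # e.g. "peer_reviewer"
    domain: str  # e.g. "longevity"

@dataclass
class Researcher:
    badges: set = field(default_factory=set)

def can_vote(researcher: Researcher, proposal_domain: str) -> bool:
    """Gate voting on a funding proposal behind a domain-expertise badge."""
    return any(b.kind == "peer_reviewer" and b.domain == proposal_domain
               for b in researcher.badges)

# A researcher with a longevity peer-review badge may vote only on
# longevity proposals; other domains remain gated.
alice = Researcher(badges={Badge("peer_reviewer", "longevity")})
eligible = can_vote(alice, "longevity")
```

The same pattern generalizes to gating data access or governance actions: the badge is checked, never spent, so holding it is both the credential and the access key.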
Atomic composability (enhanced further by Xi'an's sharded Cerberus (/contents/tech/core-concepts/radix-roadmap) ) would allow complex DeSci transactions — like funding a research proposal, minting an IP-NFT, and distributing governance tokens — to execute atomically in a single transaction manifest, without the fragile multi-step approval flows required on other chains. External Links - Radix Documentation (https://docs.radixdlt.com/) - Radix Engine — Asset-Oriented Execution (/contents/tech/core-concepts/radix-engine) - Current DeSci Landscape — Ethereum.org (https://ethereum.org/en/desci/) ## Decentralized Science Funding URL: https://radix.wiki/contents/tech/desci/desci-funding Updated: 2026-02-24 Summary: How quadratic funding, token-curated grants, and biotech DAOs are transforming research funding. Type Funding Mechanisms Key Models Quadratic Funding, DAO Treasuries, IP-NFTs, Retroactive Public Goods Largest Round Gitcoin GR15 DeSci Round ($567k to 82 projects) Bio Genesis Raise $33M+ (November 2024) Problem Solved Multi-year grant delays, institutional bias, funding concentration Introduction Traditional scientific funding is dominated by a small number of government agencies and foundations, creating multi-year waiting periods, reviewer bias, and concentration of resources among senior researchers with conservative projects. DeSci introduces alternative funding mechanisms that are faster, more transparent, and more accessible to early-career researchers and unconventional research directions. These mechanisms range from community-driven DAO treasuries to algorithmic matching systems like quadratic funding, each addressing different failure modes in the traditional grant system. Quadratic Funding Quadratic funding, pioneered by Gitcoin (https://www.gitcoin.co) , matches community donations using a formula that prioritizes the number of contributors over the size of contributions. 
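The matching rule can be sketched with the standard quadratic funding formula; this is a simplified illustration (project names, amounts, and pool size are hypothetical, and real rounds add pool caps and sybil resistance):

```python
import math

def quadratic_match(projects: dict[str, list[float]], pool: float) -> dict[str, float]:
    """Allocate a matching pool with the simplified quadratic funding rule:
    each project's weight is (sum of sqrt(contribution))^2, so many small
    donors outweigh one large donor of the same total."""
    weights = {
        name: sum(math.sqrt(c) for c in contributions) ** 2
        for name, contributions in projects.items()
    }
    total = sum(weights.values())
    return {name: pool * w / total for name, w in weights.items()}

# Two projects raise the same $100: 100 donors of $1 vs. 1 donor of $100.
grants = {"broad": [1.0] * 100, "whale": [100.0]}
match = quadratic_match(grants, pool=1000.0)
# The broadly supported project captures the overwhelming share of matching.
```

Here the "broad" project's weight is (100 × √1)² = 10,000 against the whale's (√100)² = 100, so donor count, not donation size, drives the match.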
This means broadly supported projects receive disproportionately more matching funds, democratizing funding decisions. Gitcoin Grants Round 15 featured the first dedicated DeSci Round (https://www.gitcoin.co/blog/gr15-results) , with 2,309 donors crowdfunding $67.9k to over 82 projects. With matching partners, the total pool exceeded $567k — funding projects that create new incentives for open science practices and collective problem-solving. DAO Treasury Funding DeSci DAOs (/contents/tech/desci/desci-daos) maintain on-chain treasuries that fund research through governance-approved proposals. This model is faster than traditional grants (weeks instead of years) and more transparent (all funding decisions are on-chain). VitaDAO (/ecosystem/vitadao) has deployed $4.7M this way, while Bio Protocol's (/ecosystem/bio-xyz) Bio Genesis raise brought in over $33 million in November 2024, with Binance Labs making its first DeSci investment (https://www.bio.xyz) . IP-NFT Funding IP-NFTs (/contents/tech/desci/ip-nfts) create a direct funding channel where researchers tokenize their intellectual property and communities purchase governance tokens to co-own and fund that research. This aligns incentives — funders have upside if the research succeeds, and researchers retain more control than in traditional industry licensing. External Links - DeSci: The Case for Decentralised Science — Gitcoin Blog (https://www.gitcoin.co/blog/desci-the-case-for-decentralised-science) - Bio Protocol — DeSci Financial Layer (https://www.bio.xyz) - DeSci Funding Mechanisms — Ethereum.org (https://ethereum.org/en/desci/) ## DeSci DAOs URL: https://radix.wiki/contents/tech/desci/desci-daos Updated: 2026-02-24 Summary: How decentralized autonomous organizations are restructuring scientific research funding and governance. 
Type Organizational Model First DeSci DAO VitaDAO (/ecosystem/vitadao) (July 2021) Active BioDAOs 11+ via Bio Protocol (/ecosystem/bio-xyz) Domains Longevity, Women's Health, Psychedelics, Genomics, Brain Health, Rare Diseases Total Deployed $4.7M+ (VitaDAO alone) Governance Token-weighted voting, Ostrom-inspired models Introduction DeSci DAOs are decentralized autonomous organizations focused on funding and governing scientific research. Unlike traditional grant agencies or venture capital firms, DeSci DAOs enable communities to collectively allocate capital to research they believe in, using transparent on-chain governance to make funding decisions. The model emerged in 2021 with the launch of VitaDAO (/ecosystem/vitadao) , a community-owned collective funding longevity research. Since then, specialized DAOs have formed around women's health, psychedelics, genomics, brain health, rare diseases, synthetic biology, and more — many incubated through the Bio Protocol (/ecosystem/bio-xyz) (formerly Bio.xyz) launchpad. How DeSci DAOs Operate Treasury Management: DeSci DAOs maintain on-chain treasuries funded through token sales, donations, or investment rounds. VitaDAO (/ecosystem/vitadao) received a notable $4.1 million investment from Pfizer Ventures in December 2022 (https://www.bio.xyz) , validating the model in the eyes of traditional pharma. Governance: Most DeSci DAOs use token-weighted voting, where holding governance tokens grants proportional influence over funding decisions. Some, like PsyDAO (/ecosystem/psydao) , use more sophisticated Ostrom-inspired governance (https://www.psydao.io) with separate token classes for governance and operations. IP Ownership: Funded research IP is often held collectively via IP-NFTs (/contents/tech/desci/ip-nfts) , giving token holders governance rights over the resulting intellectual property. Major DeSci DAOs - VitaDAO (/ecosystem/vitadao) — Longevity research. 31 projects funded, $4.7M deployed, 3 companies founded. 
- AthenaBIO (/ecosystem/athenadao) — Women's health research, focusing on ovarian aging and reproductive biology. - PsyDAO (/ecosystem/psydao) — Psychedelic science. 6 grants totaling $280k across MDMA, DMT, and LSD research. - GenomesDAO (/ecosystem/genomesdao) — Genomic data ownership and privacy-preserving DNA sequencing. - ValleyDAO — Synthetic biology and biomanufacturing. - CryoDAO — Cryopreservation research. - Cerebrum DAO — Brain health and neuroscience. - HairDAO — Hair loss research; filed the first DAO scientific patent in December 2023. External Links - Bio Protocol — BioDAO Launchpad (https://www.bio.xyz) - DeSci DAOs — Ethereum.org (https://ethereum.org/en/desci/) - DeSci Wiki — Full DAO Directory (https://docs.google.com/document/d/1aQC6zn-eXflSmpts0XGE7CawbUEHwnL6o-OFXO52PTc/) ## IP-NFTs and IP Tokens URL: https://radix.wiki/contents/tech/desci/ip-nfts Updated: 2026-02-24 Summary: How intellectual property NFTs enable decentralized ownership and funding of scientific research. Type Financial Primitive Created By Molecule (/ecosystem/molecule) First Launched June 2023 (Newcastle University) Chain Ethereum IP Tokens Market Cap ~$16M (46 active IPTs) Total DEX Volume $61.4M all-time Introduction IP-NFTs (Intellectual Property Non-Fungible Tokens) are a financial primitive that attaches legal intellectual property rights to an on-chain NFT. Created by Molecule (/ecosystem/molecule) , they allow researchers and institutions to tokenize their IP — such as drug candidates, research data, or biotech patents — and use that tokenized IP to raise funding from decentralized communities. This solves a fundamental problem in academic research: traditionally, institutions own all IP generated by their researchers, and early-stage biotech IP is nearly impossible to fund through conventional venture capital due to long timelines and high risk. 
IP-NFTs create a new asset class that connects researchers directly with communities willing to fund high-risk, high-impact science. How IP-NFTs Work An IP-NFT binds a legal agreement (such as an exclusive license or assignment of IP rights) to an NFT on Ethereum. The NFT holder has legal rights to the underlying IP, verified through on-chain ownership. This enables transparent provenance, programmable royalties, and composable financial instruments built on top of research IP. The first IP-NFT from a university was launched in June 2023 from Newcastle University (https://www.bio.xyz) , and the first DAO-filed scientific patent followed in December 2023 from HairDAO (https://www.hairdao.xyz) . IP Tokens (IPTs) IP Tokens are fractionalized governance tokens derived from IP-NFTs. They allow communities to co-own and govern research IP collectively. As of early 2026, Molecule's platform lists 46 unique IP Tokens with a combined market cap of approximately $16 million (https://www.molecule.xyz) and $61.4 million in all-time DEX trading volume across 296,000+ transactions. Use Cases Longevity Research: VitaDAO (/ecosystem/vitadao) uses IP-NFTs to fund early-stage longevity research, having deployed $4.7 million across 31 projects with 8 IP transactions generated. Women's Health: AthenaBIO (/ecosystem/athenadao) funds research into ovarian aging and women's health through IP-NFT-backed grants. Psychedelics: PsyDAO (/ecosystem/psydao) funds research into MDMA prodrugs, endogenous DMT, and psychedelic-longevity intersections, with $280k deployed across 6 grants. IP-NFTs effectively create a two-sided marketplace: researchers get earlier, more accessible funding, while communities get governance rights over potentially valuable biotech IP. 
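The IP-NFT/IP-token relationship can be sketched as a simple model (plain Python; the license identifier, holders, and token amounts are hypothetical, purely to show how fractional balances map to governance share):

```python
from dataclasses import dataclass

@dataclass
class IPNFT:
    """An NFT bound to a legal agreement over research IP (fields hypothetical)."""
    license_id: str
    holder: str

@dataclass
class IPTokenPool:
    """Fractional governance tokens derived from an IP-NFT: the NFT is held
    by the pool, and token balances confer proportional governance weight."""
    nft: IPNFT
    supply: float
    balances: dict

    def governance_share(self, holder: str) -> float:
        return self.balances.get(holder, 0.0) / self.supply

# Illustrative split between a DAO treasury, the researcher, and the community.
pool = IPTokenPool(
    nft=IPNFT(license_id="example-license", holder="pool"),
    supply=1_000_000.0,
    balances={"dao_treasury": 600_000.0, "researcher": 250_000.0,
              "community": 150_000.0},
)
```

The key property is that governance rights are derivative of the single legal wrapper: the NFT stays whole while the tokens divide its control.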
External Links - Molecule Documentation — IP-NFT Framework (https://docs.molecule.to) - Molecule — DeSci Token Trading Platform (https://www.molecule.xyz) - IP-NFTs in DeSci — Ethereum.org (https://ethereum.org/en/desci/) ## What is Decentralized Science (DeSci) URL: https://radix.wiki/contents/tech/desci/what-is-desci Updated: 2026-02-24 Summary: An overview of the DeSci movement applying Web3 tools to scientific research funding, publishing, and collaboration. Type Movement / Ecosystem Emerged ~2021 Built On Ethereum, IPFS, various L1/L2 chains Key Primitives DAOs, IP-NFTs, Quadratic Funding, DID Related Open Science, Open Access, Web3 Notable Orgs VitaDAO (/ecosystem/vitadao) , Molecule (/ecosystem/molecule) , Bio Protocol (/ecosystem/bio-xyz) Introduction Decentralized Science (DeSci) is a movement that aims to build public infrastructure for funding, creating, reviewing, crediting, storing, and disseminating scientific knowledge (https://ethereum.org/en/desci/) fairly and equitably using the Web3 stack. It builds on the broader open science movement by incorporating blockchain primitives — DAOs, NFTs, smart contracts, and decentralized identity — to restructure how scientific research is organized and incentivized. Traditional science faces systemic challenges: multi-year grant waiting periods, publication bias that suppresses negative results, journals that charge exorbitant fees for publicly funded research, and institutional IP ownership that limits researcher autonomy. DeSci proposes transparent, on-chain alternatives to each of these bottlenecks. Core Pillars Open Funding: DeSci replaces centralized grant committees with transparent mechanisms like quadratic funding (/contents/tech/desci/desci-funding) , DAO treasuries, and token-curated grants. Funding decisions happen on-chain, reducing bias and accelerating capital deployment to researchers. Open Publishing: Web3-native publishing models eliminate publisher gatekeeping. 
Researchers can publish findings on decentralized storage (IPFS, Arweave) with on-chain timestamping that establishes priority without waiting months for journal review cycles. IP Ownership: IP-NFTs (/contents/tech/desci/ip-nfts) allow researchers to retain ownership of their intellectual property, tokenize it for funding, and share upside with funders — replacing the traditional model where institutions own all generated IP. Peer Review: Token-based incentive systems compensate reviewers for their labor and create transparent reputation systems, addressing the current model where peer review is unpaid labor that primarily benefits publishers (https://ethereum.org/en/desci/) . Data Sharing: Decentralized storage ensures research data — including negative results and raw datasets — remains permanently accessible and censorship-resistant, countering the publication bias that suppresses unsuccessful experiments. DeSci vs Traditional Science The differences are structural. In traditional science, small centralized groups decide funding distribution; in DeSci, the public participates via DAOs (/contents/tech/desci/desci-daos) and quadratic donations. Traditional collaboration is limited by institutional affiliations; DeSci enables global, dynamic teams coordinating through on-chain governance. Traditional publishing follows slow, opaque pathways; DeSci offers new Web3-native models with embedded incentive mechanisms. Perhaps most significantly, DeSci changes who benefits from scientific output. Instead of publishers capturing value from publicly funded research, DeSci creates transparent value chains where researchers, funders, and the public all participate in the upside of discoveries. 
External Links - Decentralized Science (DeSci) — Ethereum.org (https://ethereum.org/en/desci/) - DeSci Foundation (https://www.descifoundation.org/) - Decentralized science: a new paradigm — Nature Biotechnology (https://www.nature.com/articles/s41587-023-01829-1) - DeSci Wiki — Comprehensive Project Directory (https://docs.google.com/document/d/1aQC6zn-eXflSmpts0XGE7CawbUEHwnL6o-OFXO52PTc/) ## Radix Namespace URL: https://radix.wiki/ideas/radix-namespace Updated: 2026-02-24 Overview A brand-new, open-source, decentralized name service namespace for Radix is available on Stokenet as a launchpad for the community to adopt. This is not a continuation of any existing name service, in either legal structure or functionality; it is a separate namespace intended to be operated independently and facilitated by the community in a "code-fork-and-migrate" fashion as additional features are added. Importantly, it was designed to be taken on and instantiated for mainnet by another community member or ecosystem project. The intention is a clear separation of stewardship and operation: the next operator becomes the instantiator and steward of the mainnet deployment and go-live process. Current Stokenet deployment: Package: package_tdx_2_1pkq94qsyttgk89fe4suh3xsq24adfeypd9s6pd0zramlt8cew4ctty Component: component_tdx_2_1cq3hzzgwypv3494aprg76c3pvxwpxmwalm7ld257pudj8urzc6l5ap Initial Code Org (not the active repo, but the genesis to be forked): https://github.com/radix-namespace (https://github.com/radix-namespace) Community-ready tooling (deployment is trivial - no build tools required) To make adoption as easy as possible, Wylie has built the tooling so deployment is essentially "turnkey." The handoff is designed so the next community operator can fork the repos, then instantiate and configure the system with just a GitHub account, very little dev knowledge, and no need for build tools, custom pipelines, or a complex setup. 
The same applies to any future forks that may occur. This effectively allows the namespace to live on in perpetuity and opens up many options for the community to decide, at any time, which features, styles of stewardship, and mechanisms serve them best. Included tooling: - A management client that handles instantiation, configuration (including reserving ecosystem domains), inspection, and component verification. - A TypeScript SDK (plus docs) that makes it easy for others to build UIs and platform integrations. - GitHub repos with workflows designed to support a facilitator/project lead. The goal is simple: pick it up, run the management tooling, and deploy - the heavy lifting has already been done to make this repeatable and low-friction. Compatibility with existing name services While this is a new namespace, the mainnet instantiator can choose to mark existing domain NFT resources as "importable." This enables current holders to prove ownership and receive the equivalent domain within the new namespace. This also applies to future namespace migrations (during forks, etc.). 
Key features - Register human-readable Radix domains backed by bonds (stablecoins are the intended use case) - Create/edit/delete subdomains under any registered domain - Store arbitrary data records (wallet addresses, social links, metadata) - Reverse resolution by linking an account address to a primary domain - Domain import compatibility via approved domain NFT resources (prove ownership → receive equivalent namespace domain) - Third-party registrars can sign up, set a fee percentage, and earn fees per facilitated registration - Bond model (rebond/unbond): bonds aren’t burned; owners can swap token backing or return the domain to reclaim the bond - Admin can pre-allocate domains to specific accounts before the system goes live; anyone can pay on behalf of eligible claimants - Go-live is fully permissionless: admin badge is burned, permanently removing admin access - Each domain gets an isolated subregistry component and a dApp definition account for wallet recognition - Each domain NFT includes an on-ledger generated QR code SVG (no IPFS dependency) - Subregistries can update their associated domain resource address to support migration between providers ## Radix Namespace URL: https://radix.wiki/ecosystem/radix-namespace Updated: 2026-02-24 Summary: Permissionless .xrd domain name service on Radix with stablecoin-bonded domains, subdomains, reverse resolution, and third-party registrars. Introduction Radix Namespace is a permissionless, on-ledger domain name service built on the Radix network (/contents/tech/core-concepts/radix-network) . It was developed from the ground up by community member Wylie and open sourced in February 2026 (https://t.me/RadixDevelopers) . It is implemented as an entirely new namespace (distinct from the now deprecated Radix Name Service) and is intended to be operated independently, with optional compatibility paths for importing domains from other accepted domain-NFT resources. The system is currently deployed on Stokenet (Radix testnet). 
A key part of the design is that mainnet instantiation and stewardship must be taken on by another community member or ecosystem project (i.e., a separate operator becomes the instantiator and go-live steward). The "genesis" source code (https://github.com/radix-namespace) is fully open and organised as a GitHub organisation with core contracts and "turnkey" tooling, including a management client that guides deployment/configuration. Current Stokenet deployment: - Package: package_tdx_2_1pkq94qsyttgk89fe4suh3xsq24adfeypd9s6pd0zramlt8cew4ctty - Component: component_tdx_2_1cq3hzzgwypv3494aprg76c3pvxwpxmwalm7ld257pudj8urzc6l5ap - Genesis code org (to be forked): https://github.com/radix-namespace Features Radix Namespace provides a comprehensive naming system with the following capabilities: - Human-readable domains — register domains backed by bonds (stablecoins are the intended use case) - Subdomains — create, edit, and delete subdomains under any registered domain - Data records — store arbitrary records (wallet addresses, social links, metadata) against a domain - Reverse resolution — link an account address to a primary domain so others can discover a domain by address - Domain import compatibility — accepted domain NFT resources can prove ownership and receive the equivalent domain in this namespace - Third-party registrars — service providers can register as registrars, set a fee percentage, and earn fees on registrations they facilitate - Bond model (rebond/unbond) — bonds aren’t burned; owners can swap token backing (rebond) or return the domain to reclaim the bond (unbond) - Pre-allocation — admin can reserve domains for specific accounts before go-live; anyone can pay on behalf of an eligible claimant - On-ledger QR codes — each domain NFT includes a generated QR code stored entirely on-ledger as a lightweight SVG (no IPFS dependency) - Isolated subregistries — each domain gets its own subregistry component and a dApp definition account for wallet 
recognition Architecture The project is organised as a GitHub organisation (https://github.com/radix-namespace) containing several repositories, including: - Core Scrypto smart contracts — the namespace logic for registration, management, and resolution - TypeScript SDK — developer-friendly bindings for integrating the on-ledger components into apps and platforms - SDK documentation — reference docs for the SDK - Management client — a guided “wizard”-style tool for instantiation, configuration (including reservations), inspection, and activation (burning the admin badge to go live) Most repos include workflows intended to support facilitators/project leads, and the design allows the instantiator to define which external domain-NFT resources are “importable” for compatibility/migration purposes. Permissionless Design Once instantiated on mainnet, the admin badge is burned, permanently removing all admin access and making the system fully permissionless. No single entity can modify, censor, or control domain registrations after launch. Relationship to ILIS Radix Namespace is under community consideration for integration into the broader ILIS (I Like It Stable) (https://flux.ilikeitstable.com) ecosystem, built by octo.xrd (Radix Foundation member). Radix Namespace became a subject of ILIS-oriented discussion in February 2026, with the fUSD stablecoin — a USD-pegged token created by borrowing against XRD collateral — being considered as a bond resource for domain registration. The bond model creates a natural synergy: domain registrations generate a use case for fUSD, while fUSD provides a decentralised stablecoin option for backing domain bonds without relying on centralised stablecoin issuers. 
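The rebond/unbond lifecycle described above can be sketched conceptually (plain Python, not the actual Scrypto implementation; token symbols and amounts are illustrative assumptions):

```python
from dataclasses import dataclass

@dataclass
class DomainBond:
    """Conceptual model of a bonded domain: the bond is escrowed, never
    burned, so the owner can swap its token backing (rebond) or give the
    domain back and reclaim the bond in full (unbond)."""
    domain: str
    token: str     # e.g. a USD stablecoin symbol (illustrative)
    amount: float
    active: bool = True

    def rebond(self, new_token: str, new_amount: float) -> tuple[str, float]:
        """Swap the token backing; the old bond is released to the owner."""
        released = (self.token, self.amount)
        self.token, self.amount = new_token, new_amount
        return released

    def unbond(self) -> tuple[str, float]:
        """Return the domain and reclaim the current bond."""
        self.active = False
        return (self.token, self.amount)

# Illustrative lifecycle: bond with one stablecoin, swap backing, then exit.
bond = DomainBond(domain="example.xrd", token="fUSD", amount=20.0)
old = bond.rebond("otherUSD", 20.0)  # fUSD bond comes back to the owner
refund = bond.unbond()               # domain returned, otherUSD reclaimed
```

Because registration is a refundable escrow rather than a fee burn, the cost of holding a domain is opportunity cost on the bonded stablecoins, not the principal itself.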
External Links - GitHub Organisation — radix-namespace (https://github.com/radix-namespace) - ILIS (I Like It Stable) — fUSD Stablecoin (https://flux.ilikeitstable.com) - Telegram — Radix Developer Discussion (https://t.me/RadixDevelopers) Radix Namespace Type Name Service / Address Abstraction Status Deployed on Stokenet (testnet) Original Developer Wylie Bond Model USD stablecoins (bonded, not burned) GitHub github.com/radix-namespace (https://github.com/radix-namespace) Stokenet Package package_tdx_2_1pkq94qs...cew4ctty Ecosystem Part of ILIS (https://flux.ilikeitstable.com) ## Red URL: https://radix.wiki/community/red Updated: 2026-02-23 Welcome to Red This is your personal community page. Edit it to share your thoughts, projects, and contributions with the RADIX community. About Me Tell the community about yourself... ## Main Stef URL: https://radix.wiki/community/main-stef Updated: 2026-02-23 Welcome to Main Stef This is your personal community page. Edit it to share your thoughts, projects, and contributions with the RADIX community. About Me Tell the community about yourself... ## Viewer URL: https://radix.wiki/community/viewer Updated: 2026-02-23 Welcome to Viewer This is your personal community page. Edit it to share your thoughts, projects, and contributions with the RADIX community. About Me Tell the community about yourself... ## Pixela URL: https://radix.wiki/community/pixela Updated: 2026-02-23 Welcome to Pixela This is your personal community page. Edit it to share your thoughts, projects, and contributions with the RADIX community. About Me Tell the community about yourself... ## senza info URL: https://radix.wiki/community/senza-info Updated: 2026-02-23 Welcome to senza info This is your personal community page. Edit it to share your thoughts, projects, and contributions with the RADIX community. About Me Tell the community about yourself... 
## octo URL: https://radix.wiki/community/octo Updated: 2026-02-23 Welcome to octo This is your personal community page. Edit it to share your thoughts, projects, and contributions with the RADIX community. About Me Tell the community about yourself... ## RFC: Migrate Radix Developer Documentation to RADIX.wiki URL: https://radix.wiki/ideas/rfc-migrate-radix-developer-documentation-to-radixwiki Updated: 2026-02-23 TLDR: The current Radix developer docs are fragmented across 6+ domains. I’ve run 8 workshops onboarding developers to Radix and maintained RADIX.wiki since 2022. I’m offering to host all documentation for free on a Wikipedia-style platform where anyone can contribute and earn points for keeping it current. The site is live at radix.wiki Introduction The existing developer documentation for Radix is fragmented across docs.radixdlt.com, developers.radixdlt.com, academy.radixdlt.com, learn.radixdlt.com, radix-engine-docs.radixdlt.com, radix-core-api.redoc.ly and GitHub. This architecture, and the tendency of documentation to become quickly outdated, has made developer onboarding much less efficient than it could have been. My conviction is that developers are the primary customers of digital ledgers, so in 2024 I organized the Radix Wiki Hackathon (radixdlt.com/blog/the-radix-wiki-hackathon) and have run eight ‘DappInADay’ (radix.wiki/contents/history) workshops at London universities since then. In planning those events I found it useful to create a Developer Resources (radix.wiki/contents/resources/radix-developer-resources) page on RADIX.wiki that the students could use as a reference in preparation for and during the events. Now, with the Radix Foundation dissolving, there is an opportunity to rationalize and improve the entire documentation and onboarding flow for the hordes of developers we’re expecting to descend on Xi’an. My proposal is to host that documentation on RADIX.wiki - a site that I’ve maintained since 2022. 
I built the wiki around July 2022 and received a Dandelion grant (radixdlt.com/blog/dandelions-program-update) to develop it in October of that year. My main rationale was that the only proven model for maintaining knowledge bases of this magnitude is distributed contribution with low friction, i.e. the Wikipedia approach. Comprehensive, up-to-date documentation of the Radix ecosystem can only succeed at scale with a Wikipedia-style knowledge base that anyone can contribute to. Not being a developer, I wasn’t able to develop the site as a Radix-native app, but with the advancements in AI I’ve finally been able to build what I originally envisioned. The staging site is radix-wiki.vercel.app and once there’s parity I’ll move it to RADIX.wiki. I’ve started to sketch out the developer docs here: radix.wiki/developers The Offer I’ll host the documentation indefinitely, at no cost to the community. I’ve been following Radix since 2017 and I’m building caper.network on Radix so I’m not going anywhere. I also have a vision to host more workshops and build an ETH Global / Superteam for Radix so it’s in my interest to make sure the documentation is accurate and effective. RADIX.wiki already has the infrastructure, the domain, and a track record of continuous operation. There are no hosting fees or grants needed. The documentation simply needs a home and community buy-in. The question of redirects from the existing documentation is up for debate but I’d argue that the current structure is such a mess that it would be better to add a blanket redirect rather than invest a lot of time and effort to preserve individual links. Functionality The wiki platform is designed to be as decentralized as possible. Users connect their Radix wallet to create and edit wiki pages organized in a hierarchical tag system. The app features a block-based content editor with support for rich text, code blocks with syntax highlighting, tables, tabs, YouTube/iframe embeds, and image uploads. 
Pages support revision history, threaded discussions, and user reputation scoring. Certain actions require minimum XRD token balances, creating a quasi-economic moderation system. Community members automatically get personal profile pages, and the platform tracks contribution statistics including pages created, edits made, and account age. Eventually I’ll work out a way of backing everything up to the Radix ledger or even hosting it on an RNS domain. Postscript - Governance & Long Term I’ve always planned on governing the wiki as a DAO on Caper with its own governance token. I also plan to honor the idea I sketched out here: t.me/EasyMoonDiscussion/5142, meaning that $EMOON will be exchangeable for the new governance token. Contributors currently earn points for maintaining and improving the documentation. These points recognize effort and create accountability, which aligns incentives: people who use the documentation have a direct path to improving it and being recognized for that work. The long-term objective of the DAO will be to raise its own funding on the Caper platform and thereafter run as a profitable entity so that it never has to seek external funding. A future Radix Global is part of that vision, but any future plans will be decided by the DAO members and are beyond the scope of this immediate proposal. Why Not GitHub? So many reasons… - Not Radix native. - Not suitable for illustrated or interactive tutorials. - Unsuitable for beginners. - Even Dan Robinson (x.com/danrobinson/status/2006684676291965094) finds it too complicated. - Horrible UI/UX. - Limited embedding. - Can’t be governed as a DAO. What We Need From the Community - Consensus that community-maintained documentation is preferable to abandoned official documentation. - Content access to existing documentation sources (or permission to scrape/migrate). - Contributors willing to help with the initial migration and ongoing maintenance. Conclusion Poor documentation does more harm than good. 
RADIX.wiki was created for this situation. The infrastructure exists. The commitment exists. The model is proven across thousands of successful wikis. Let’s give Radix documentation a permanent, community-owned home. ## DAO Treasury Custody Setup URL: https://radix.wiki/ideas/dao-treasury-custody Updated: 2026-02-22 Summary: Establishing secure custody for community DAO treasury assets via PrimeVault or native multi-sig. Type Treasury Management Options PrimeVault, Native Multi-sig, Gnosis Safe (eXRD) Assets XRD + stablecoins Status Evaluating custody solutions Overview Before the Foundation can transfer assets to the community DAO entity, a secure custody solution must be in place. Three approaches are being evaluated in the RAC (https://t.me/RadixAccountabilityCouncil) : - PrimeVault — institutional custodian already used by the Foundation. RAC members would be added as authorized signers, then Foundation access revoked. - Native Radix Multi-sig — on-chain multi-signature wallet using Radix's access controller primitives. Two community designs exist: xStelea's multi-account signing approach and Don Marco's component-based preset-action model (https://radixtalk.com/t/dao-multisig-wallet/2164) . - Gnosis Safe (eXRD) — EVM multi-sig holding bridged eXRD via Hyperlane. Trivial to set up and zero cost, but introduces bridge dependency risk. Proposed PrimeVault Handover - Foundation completes remaining payments from treasury - Foundation adds RAC members as PrimeVault users with separate credentials - Foundation is removed from the PrimeVault account - RAC members become sole owners/admins The RAC favors this approach for its speed and security, with a longer-term migration to native multi-sig once the tooling matures. 
External Links - DAO Multisig Wallet — RadixTalk (https://radixtalk.com/t/dao-multisig-wallet/2164) - RAC Telegram Group (https://t.me/RadixAccountabilityCouncil) ## Market Making Transition URL: https://radix.wiki/ideas/market-making-transition Updated: 2026-02-22 Summary: Seven-step plan to transition XRD market making from Foundation management to community/third-party oversight. Type Financial Operations Transition Steps 7 Status Step 1-2 (RAC briefing) Models A: Standard, B: Community-Directed, C: Third-Party Manager Overview As part of the 2026 decentralization strategy, XRD market making operations must transition from Foundation control to community or third-party management. The Community Transition Planner (https://docs.google.com/spreadsheets/d/1PN1iOdHa9JJRbHgIkY0Mi4Sbyc5uz1qn053_gXL7IOs) outlines a seven-step process for this handover. Transition Steps - Assign MM Lead — RAC appoints a member with DeFi/CeFi experience to own the workstream - Foundation Briefing — RAC briefed on current MM arrangement (exchanges, spread targets, cost, performance) - Review Proposals — Evaluate incoming third-party MM management proposals - Choose Model — Option A: MM manages strategy (simplest); Option B: Community directs strategy (more control); Option C: Community hires a third-party manager (most oversight) - Allocate Capital — Determine how much treasury to dedicate to market making - Governance Proposal — Community ratifies chosen model and capital allocation via formal vote - Formal Onboarding — Community entity onboards with the MM; Foundation steps back External Links - 2026 Strategy: The Next Chapter of Radix (https://www.radixdlt.com/blog/2026-strategy-the-next-chapter-of-radix) ## Wallet Signal/Relay Endpoint Selection URL: https://radix.wiki/ideas/wallet-endpoint-selection Updated: 2026-02-22 Summary: User-selectable signaling server and relay endpoints in the Radix Wallet, mirroring gateway selection. 
Type Wallet Feature Status In Development Source Adam (CSO Radix Foundation), RAC General Mirrors Existing Gateway selection UI Overview The wallet team is updating the signal server and relay system to give users direct control over which endpoints they connect to — similar to the existing gateway selection feature in the Radix Wallet (/contents/tech/core-concepts/radix-wallet) . This is critical for the infrastructure decentralization (/roadmap/signaling-relay-rfp) effort. As the Foundation opens RFPs for independent operators to run Signaling Servers and Connect Relays, the wallet must support pointing to alternative providers rather than being hardcoded to Foundation endpoints. Impact - Multi-provider resilience: Users can switch providers if one goes down - Pre-transition readiness: Infrastructure decentralization can proceed before Foundation sunset - Competitive market: Enables commercial operators to offer differentiated service tiers External Links - RFPs for Gateway and Relay Services (https://www.radixdlt.com/blog/the-next-phase-of-decentralization-rfps-for-gateway-and-relay-services) ## Wallet Raw TxManifest Support URL: https://radix.wiki/ideas/wallet-txmanifest-support Updated: 2026-02-22 Summary: In-wallet pasting of raw transaction manifests for direct component interaction without Radix Connect. Type Wallet Feature Status In Development Source Adam (CSO Radix Foundation), RAC General Goal Remove dependency on Radix Connect Overview The Radix Wallet team is adding the ability to paste raw transaction manifests (/developers/reference/transaction-manifest) directly into the wallet, enabling users to interact with on-chain components without requiring the Radix Connect browser extension or a dApp frontend. 
This feature was announced by Adam (CSO, Radix Foundation) in the RAC Telegram group (https://t.me/RadixAccountabilityCouncil) on February 19, 2026, as part of efforts to reduce dependency on Foundation-operated infrastructure ahead of the 2026 transition. Impact - Decentralization: Users can submit transactions even if the Signaling Server or Connect Relay is temporarily unavailable - Power users: Enables direct component interaction for developers and advanced users - Emergency access: Provides a fallback transaction method independent of third-party infrastructure External Links - RAC Telegram Group (https://t.me/RadixAccountabilityCouncil) ## Hyperscale 500k TPS Public Test URL: https://radix.wiki/ideas/hyperscale-500k-test Updated: 2026-02-22 Summary: Public network test sustaining 500,000+ TPS with 590+ nodes on commodity hardware across 128 shards. Type Performance Milestone Date January 31, 2026 Sustained TPS 500,000+ Peak TPS 800,000+ Nodes 590+ Shards 128 Hardware Commodity (m6i.xlarge) Overview The Hyperscale public test (https://www.radixdlt.com/blog/hyperscale-update-500k-public-test-done) in January 2026 marked the most significant throughput milestone in Radix history. The network sustained 500,000+ transactions per second with peaks exceeding 800,000 TPS, processing real DeFi-style swaps across 128 shards with full cross-shard atomic composability. 
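The reported figures (250k TPS on 64 shards, 500k on 128) imply one constant per-shard rate. A minimal sketch of that linear relationship, assuming ideal proportional scaling (the function name `projected_tps` is illustrative and not part of any Radix tooling):

```python
# Illustrative projection only: extrapolates the reported per-shard rate
# under the assumption of exact linear scaling.

def projected_tps(shards: int, tps_per_shard: float = 500_000 / 128) -> float:
    """Project sustained TPS for a given shard count under linear scaling."""
    return shards * tps_per_shard

# Both reported data points are consistent with a single per-shard rate:
assert projected_tps(64) == 250_000.0
assert projected_tps(128) == 500_000.0
```

Under this assumption, doubling the shard count again would project roughly 1M TPS, though the test itself only demonstrated up to 128 shards.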
Key Results - Linear scaling confirmed: 250k TPS on 64 shards → 500k TPS on 128 shards (exact proportional scaling) - Real workload: complex DeFi swap transactions, not simple transfers - Open participation: 590+ nodes from datacenters, desktops, and laptops - Commodity hardware: validators ran on AWS m6i.xlarge (4 cores, 16GB RAM) - Infrastructure: 384 bootstrap nodes, 40 validators, 6 load generators External Links - Hyperscale Update: 500k+ Public Test Done (https://www.radixdlt.com/blog/hyperscale-update-500k-public-test-done) - Interim Hyperscale: Closing the Chapter (https://www.radixdlt.com/blog/interim-hyperscale-closing-the-chapter) ## Proof of Personhood (idOS) URL: https://radix.wiki/ideas/proof-of-personhood Updated: 2026-02-22 Summary: On-chain identity verification badges via idOS integration for Sybil resistance and governance. Type Identity Verification Launched November 2025 Provider idOS Network Mechanism Badge-based (Radix native) Overview Proof-of-Personhood launched on Radix (https://www.radixdlt.com/blog/proof-of-personhood-launches-on-radix-with-idos) in November 2025 via integration with the idOS network. The system issues a Badge — a tokenized credential in the user's wallet — that proves uniqueness without exposing personal data. For developers, requiring a PoP badge is a single line of Scrypto code. For users, verification is a one-click wallet approval. The badge is critical for decentralized governance (Sybil resistance), reward programs (preventing gaming), and future permissioned DeFi applications. External Links - Proof-of-Personhood Launches on Radix with idOS (https://www.radixdlt.com/blog/proof-of-personhood-launches-on-radix-with-idos) - Radix idOS Proof-of-Personhood Portal (https://idos.radixdlt.com/) ## RedStone Oracle Integration URL: https://radix.wiki/ideas/redstone-oracles Updated: 2026-02-22 Summary: Institutional-grade price feeds with 1,200+ data sources live on Radix mainnet. 
Type Oracle Integration Launched June 2025 Feeds 1,200+ Provider RedStone Overview RedStone Oracles went live on Radix mainnet in June 2025, providing institutional-grade price feeds with over 1,200 available data sources. This integration gives Radix DeFi protocols access to reliable, tamper-resistant pricing data for lending, derivatives, and other financial applications. The Zellic security audit (https://www.radixdlt.com/blog/radix-review-2025) completed in May 2025 found zero critical or high-severity issues, providing additional confidence in the network's security posture for DeFi applications relying on oracle data. External Links - Radix Review 2025 (https://www.radixdlt.com/blog/radix-review-2025) ## Hyperlane Bridge Launch URL: https://radix.wiki/ideas/hyperlane-bridge-launch Updated: 2026-02-22 Summary: Permissionless cross-chain bridge connecting Radix to Ethereum, Arbitrum, Base, and BSC. Type Cross-chain Bridge Launched September 5, 2025 Assets USDC, USDT, wBTC, ETH Networks Ethereum, Arbitrum, Base, BSC Architecture Permissionless (Hyperlane) Overview Hyperlane went live on Radix (https://www.radixdlt.com/blog/hyperlane-is-live) in September 2025, delivering permissionless, modular cross-chain interoperability. The integration enables bridging of major assets (USDC, USDT, wBTC, ETH) from Ethereum, Arbitrum, Base, and BSC into the Radix DeFi ecosystem. The bridge is accessible via Astrolescent's Radix-native frontend and the Hyperlane Nexus reference app. Its permissionless architecture means new routes can be deployed without centralized approval. External Links - Hyperlane is Live! — Radix Blog (https://www.radixdlt.com/blog/hyperlane-is-live) - Astrolescent (https://astrolescent.com) - Hyperlane Nexus (https://nexus.hyperlane.xyz) ## Cuttlefish Protocol Upgrade URL: https://radix.wiki/ideas/cuttlefish-upgrade Updated: 2026-02-22 Summary: Mainnet protocol upgrade introducing native subintents for composable mini-transactions. 
Type Protocol Upgrade Released December 18, 2024 Key Feature Native Subintents Built On By Atomix, Bullring Overview The Cuttlefish protocol upgrade went live on mainnet in December 2024, introducing native subintents — modular mini-transactions that users can pre-sign and pass around. This enables new DeFi patterns like peer-to-peer trading and parallel auctions without intermediary smart contracts. Ecosystem Adoption - Atomix (https://www.radixdlt.com/blog/radix-subintents-in-action-peer-to-peer-trading-with-atomix) — peer-to-peer trading platform built entirely on subintents - Bullring — parallel auction system leveraging subintent composability Subintents represent a novel DeFi primitive unique to Radix, enabling use cases that are impossible or impractical on other L1s. External Links - Radix Subintents in Action: Peer-to-Peer Trading with Atomix (https://www.radixdlt.com/blog/radix-subintents-in-action-peer-to-peer-trading-with-atomix) ## Radix Engine Toolkit Update URL: https://radix.wiki/ideas/ret-scrypto-131-update Updated: 2026-02-22 Summary: Updating the Radix Engine Toolkit to support Scrypto 1.3.1 and modern Rust toolchains. Type Developer Tooling Dependency Scrypto 1.3.1 Status In Progress Overview The Radix Engine Toolkit (RET) is being updated to support Scrypto 1.3.1 (https://www.radixdlt.com/blog/scrypto-1-3-1-unlocking-modern-rust-support) and modern Rust toolchains. RET provides essential developer utilities for building and testing Scrypto components, transaction construction, and manifest building. This update ensures the full developer toolchain is compatible with Rust 1.92.0+ and the new WASM build pipeline introduced in Scrypto 1.3.1. 
External Links - Radix DLT GitHub (https://github.com/radixdlt) - Scrypto 1.3.1 Announcement (https://www.radixdlt.com/blog/scrypto-1-3-1-unlocking-modern-rust-support) ## Scrypto 1.3.1: Modern Rust Support URL: https://radix.wiki/ideas/scrypto-modern-rust Updated: 2026-02-22 Summary: Scrypto unlocks Rust 1.92.0+ support with a new WASM build pipeline, ending the 1.81.0 lockdown. Type Developer Tooling Version Scrypto 1.3.1 Released January 20, 2026 Rust Support Up to 1.92.0+ Previous Limit Rust 1.81.0 Overview Scrypto 1.3.1 (https://www.radixdlt.com/blog/scrypto-1-3-1-unlocking-modern-rust-support) removes the Rust 1.81.0 version lock that previously constrained Radix developers. The update resolves conflicts between modern WASM compiler optimizations and Radix Engine safety requirements by rebuilding the Rust standard library during package compilation, explicitly using the MVP WASM feature set. Developer Benefits - Access to Rust 1.92.0+ features, Clippy improvements, and formatting enhancements - Better IDE integration with modern Rust tooling - Easier integration with third-party no_std crates requiring updated Rust versions - Future-proof development environments The Radix Engine Toolkit is slated for updates to support Scrypto 1.3.1 compatibility. External Links - Scrypto 1.3.1: Unlocking Modern Rust Support (https://www.radixdlt.com/blog/scrypto-1-3-1-unlocking-modern-rust-support) ## Community Code Governance URL: https://radix.wiki/ideas/core-code-governance Updated: 2026-02-22 Summary: Transitioning maintenance of core Radix repositories — Node, Scrypto, Wallet — to community-led governance. Type Development Governance Priority P2 — High-use, active maintenance Repositories Node, Gateway, Scrypto, RET, Wallet, Connector Overview The Radix Foundation currently maintains all core code repositories including the Node software, Babylon Gateway, Scrypto language, Radix Engine Toolkit, Wallet (iOS/Android), and Connector Extension. 
Per the 2026 Strategy (https://www.radixdlt.com/blog/2026-strategy-the-next-chapter-of-radix) , these will transition to community-led governance. During Phase 1, the Foundation continues security reviews and PR maintenance while the community designs long-term governance models. Eventually, community members and funded contributors will take ownership of code quality, security audits, and release management. Repositories in Scope - Radix Node Software - Babylon Gateway API - Scrypto programming language - Radix Engine Toolkit (RET) - Radix Wallet (iOS & Android) - Radix Connector Extension (Chrome) - Radix Ledger App - Supporting libraries and ROLA External Links - Radix DLT GitHub (https://github.com/radixdlt) - 2026 Strategy: The Next Chapter of Radix (https://www.radixdlt.com/blog/2026-strategy-the-next-chapter-of-radix) - Foundation Operational Stack (https://www.radixdlt.com/blog/the-foundation-operational-stack-mapping-the-2026-transition) ## P3 Community Services Handover URL: https://radix.wiki/ideas/p3-services-handover Updated: 2026-02-22 Summary: Transitioning non-critical Foundation services like Dashboard, RadQuest, and dev tools to community operators. Type Infrastructure Transition Priority P3 — Non-critical but maintained Services 15+ dApps, tools, and platforms Overview Beyond the critical P1 infrastructure (Gateway, Signaling, Relay), the Foundation's operational stack (https://www.radixdlt.com/blog/the-foundation-operational-stack-mapping-the-2026-transition) includes numerous P3 services that represent opportunities for community teams to adopt, improve, or replace. 
Services Available for Handover User-Facing - Asset service (icons for dApps/tokens) - Image service (Cloudflare compression/caching) - Token/NFT price service - Dashboard (transaction indexing; alternatives like Radxplorer exist) - RadQuest (onboarding dApp) - Gumball Club (demo application) - idOS Proof-of-Personhood integration - Radix Rewards system - Consultation dApp Developer Tooling - Radix dApp Toolkit - Dev Console & Sandbox - docs.radixdlt.com - academy.radixdlt.com (live but unmaintained) - learn.radixdlt.com knowledge base - Fullstack dApp example External Links - Foundation Operational Stack: Mapping the 2026 Transition (https://www.radixdlt.com/blog/the-foundation-operational-stack-mapping-the-2026-transition) ## Signaling & Relay Services RFP URL: https://radix.wiki/ideas/signaling-relay-rfp Updated: 2026-02-22 Summary: RFPs for decentralizing the wallet signaling server and mobile browser connect relay. Type Infrastructure RFP Services Signaling Server + Connect Relay Signaling Latency <300ms globally Peak Connections ~6,000 concurrent RFP Opened February 3, 2026 Overview Two critical infrastructure services are being transitioned from Foundation to community/commercial operators via RFPs opened February 3, 2026 (https://www.radixdlt.com/blog/the-next-phase-of-decentralization-rfps-for-gateway-and-relay-services) : - Signaling Server — establishes peer-to-peer WebRTC connections between the Radix Wallet mobile app and the Chrome desktop connector extension. Requires <300ms global latency, strict privacy (no payload inspection), and handling ~6,000 concurrent peak connections. - Connect Relay (RCR) — buffers encrypted messages between mobile browsers and the Radix Wallet app. Requires end-to-end encryption, aggressive rate limiting (600s TTL), and DDoS mitigation. 
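The Connect Relay's store-and-forward behaviour with a 600-second TTL can be illustrated with a minimal sketch. This models only the behaviour described in the RFP (opaque encrypted payloads, per-message expiry); the class and method names are invented for illustration and do not reflect the real relay implementation:

```python
import time
from collections import defaultdict

class RelayBuffer:
    """Toy store-and-forward relay with per-message TTL (illustrative only).

    Messages are stored opaquely under a session id (no payload inspection)
    and dropped once older than `ttl` seconds, mirroring the 600s TTL in
    the RFP. The clock is injectable to make expiry testable.
    """

    def __init__(self, ttl: float = 600.0, clock=time.monotonic):
        self.ttl = ttl
        self.clock = clock
        self._queues = defaultdict(list)  # session_id -> [(timestamp, ciphertext)]

    def publish(self, session_id: str, ciphertext: bytes) -> None:
        self._queues[session_id].append((self.clock(), ciphertext))

    def poll(self, session_id: str) -> list[bytes]:
        """Return all unexpired messages for a session and clear its queue."""
        now = self.clock()
        return [c for (t, c) in self._queues.pop(session_id, [])
                if now - t <= self.ttl]
```

A real operator would additionally need the rate limiting and DDoS mitigation the RFP calls for; those concerns are out of scope for this sketch.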
External Links - RFPs for Gateway and Relay Services (https://www.radixdlt.com/blog/the-next-phase-of-decentralization-rfps-for-gateway-and-relay-services) - Foundation Operational Stack: Mapping the 2026 Transition (https://www.radixdlt.com/blog/the-foundation-operational-stack-mapping-the-2026-transition) ## Gateway Service Decentralization URL: https://radix.wiki/ideas/gateway-service-rfp Updated: 2026-02-22 Summary: RFP for independent operators to run the Babylon Gateway API that powers wallets and dApps. Type Infrastructure RFP Service Babylon Gateway Endpoint Database ~2TB PostgreSQL (60GB/mo growth) SLA Target 99.9% uptime, <1s latency RFP Opened February 3, 2026 Overview The Babylon Gateway is the primary API that aggregates ledger data into a queryable PostgreSQL database, enabling transaction submission and queries for the Radix Wallet (/contents/tech/core-concepts/radix-wallet) , Dashboard, and most ecosystem dApps. As part of the decentralization transition (https://www.radixdlt.com/blog/the-next-phase-of-decentralization-rfps-for-gateway-and-relay-services) , the Foundation is seeking one or more professional operators to take over this service. 
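A back-of-envelope projection from the quoted figures (~2 TB today, ~60 GB per month growth) gives a sense of the storage commitment a prospective operator is taking on. The helper below is purely illustrative, not an official estimate:

```python
# Illustrative capacity planning from the RFP's figures: ~2 TB (2000 GB)
# baseline plus ~60 GB/month of linear growth.

def projected_db_gb(months: int, base_gb: float = 2000.0,
                    growth_gb_per_month: float = 60.0) -> float:
    """Project Gateway PostgreSQL size (GB) after `months` of growth."""
    return base_gb + months * growth_gb_per_month

assert projected_db_gb(0) == 2000.0
assert projected_db_gb(12) == 2720.0   # ~2.7 TB after one year
```

Actual growth will vary with ledger activity, so operators should treat the linear rate as a planning floor rather than a ceiling.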
Requirements - Database: ~2TB with 60GB monthly growth - Uptime: 99.9% availability - Latency: <1 second query response globally - Revenue model: Operators must guarantee current usage levels for public services (Wallet, Dashboard) but may commercialize premium access for data aggregators, institutions, and high-volume dApps External Links - RFPs for Gateway and Relay Services (https://www.radixdlt.com/blog/the-next-phase-of-decentralization-rfps-for-gateway-and-relay-services) - Foundation Operational Stack (https://www.radixdlt.com/blog/the-foundation-operational-stack-mapping-the-2026-transition) ## Hyperlane Bridge Route Expansion URL: https://radix.wiki/ideas/hyperlane-bridge-expansion Updated: 2026-02-22 Summary: Expanding cross-chain warp routes to additional networks and assets based on community demand. Type Cross-chain Bridge Live Since September 5, 2025 Current Assets USDC, USDT, wBTC, ETH Current Networks Ethereum, Arbitrum, Base, BSC Pending Solana + additional routes Overview Hyperlane went live on Radix mainnet (https://www.radixdlt.com/blog/hyperlane-is-live) in September 2025, bringing permissionless, modular cross-chain interoperability. The integration enables bridging of USDC, USDT, wBTC, and ETH from Ethereum, Arbitrum, Base, and BSC into the Radix DeFi ecosystem. The bridge is accessible via Astrolescent (https://astrolescent.com) (Radix-native frontend) and Hyperlane Nexus (https://nexus.hyperlane.xyz) (reference app). New routes — including Solana — are being added based on community demand. Roadmap - Expand warp routes to additional networks based on community demand - Solana route deployment (announced as coming soon) - Permissionless architecture enables anyone to deploy new routes without centralized approval - Streamlined developer integration for building cross-chain dApps on Radix External Links - Hyperlane is Live! 
— Radix Blog (https://www.radixdlt.com/blog/hyperlane-is-live) - Astrolescent Bridge (https://astrolescent.com) - Hyperlane Nexus (https://nexus.hyperlane.xyz) ## Radix Rewards Season 2 URL: https://radix.wiki/ideas/radix-rewards-season-2 Updated: 2026-02-22 Summary: Community-proposed continuation of the Radix incentive program to drive on-chain activity. Type Incentive Program Predecessor Season 1 (114.3M XRD distributed) Status Temperature Check (74% support) S1 Impact Doubled weekly transactions & DEX volume Overview Following the success of Radix Rewards Season 1 (https://www.radixdlt.com/blog/radix-rewards-s1-distribution) , which distributed 114.3 million XRD and doubled on-chain transactions and DEX volume, the community has proposed Season 2 with 74% support in an initial RFC. The RAC has escalated this to a formal governance temperature check, with two active consultations: one defining the S2 reward pool and another specifying eligible activities. Season 1 Results - Total Distribution: 114,347,194 XRD (including S0 bonuses) - Vesting: 20% immediate, 80% over 7 days with Diamond Hands bonus - Impact: Doubled weekly on-chain transactions and DEX trading volume - Ecosystem metrics: CaviarNine $250M+ cumulative volume, Astrolescent $30M+ swap volume, 53,114+ deployed components External Links - Radix Rewards S1 Distribution (https://www.radixdlt.com/blog/radix-rewards-s1-distribution) - Rewards Portal (https://incentives.radixdlt.com/) ## Community DAO Formation URL: https://radix.wiki/ideas/community-dao-formation Updated: 2026-02-22 Summary: Establishing full on-chain governance via DAO or multi-sig for community-controlled funding allocation. 
Type Governance Structure Phase Phase 3 of 2026 Strategy Status Jurisdiction consultation active Tooling DAOpensource (Scrypto) (https://github.com/Stabilis-Labs/DAOpensource) Overview The final phase of Radix's 2026 decentralization strategy (https://www.radixdlt.com/blog/2026-strategy-the-next-chapter-of-radix) envisions full community control of funding allocation and distribution via on-chain governance. Community members will submit proposals for funding in marketing, code, or business development, with the ecosystem voting on priorities. Transition Path Phase 1 (Current) — Foundation Administered Foundation maintains security, reviews PRs, and runs non-binding consultations to gauge community sentiment while governance structures are designed. Phase 2 — Community-Initiated RFPs Community proposes and votes on RFPs for marketing, partnerships, integrations, and incentive structures. Phase 3 — Full Decentralization On-chain governance (DAO/Multi-sig) controls allocation and distribution of funding. The Foundation's brand, trademarks, and IP transfer to community custody via blind trusts or DAO-controlled entities. External Links - 2026 Strategy: The Next Chapter of Radix (https://www.radixdlt.com/blog/2026-strategy-the-next-chapter-of-radix) - 2026 Strategy FAQ (https://www.radixdlt.com/blog/2026-strategy-faq) - DAOpensource — Scrypto DAO Package (https://github.com/Stabilis-Labs/DAOpensource) ## Consultations V2 dApp URL: https://radix.wiki/ideas/consultations-v2 Updated: 2026-02-22 Summary: Open-source two-phase governance dApp with temperature checks and formal proposals. Type Governance Tooling Status Live on Stokenet, open-sourced Open Source Feb 22, 2026 Components Scrypto + Backend + Frontend Overview Consultations V2 is the revamped governance dApp enabling community-led decision making on Radix. 
It introduces a two-phase voting structure: a Temperature Check phase to gauge whether a proposal should proceed, followed by a Formal Proposal phase for the binding vote. The V2 was launched on Stokenet in February 2026 and fully open-sourced on GitHub (https://github.com/radixdlt) on February 22, 2026, with Scrypto smart contracts, backend, and frontend code all available for community deployment. Active Proposals (Feb 2026) - Community DAO Jurisdiction — selecting legal jurisdiction for the community DAO entity - Radix Rewards Season 2 Pool — defining the token allocation for Season 2 - Season 2 Activities Framework — specifying eligible activities for earning rewards External Links - 2026 Strategy: The Next Chapter of Radix (https://www.radixdlt.com/blog/2026-strategy-the-next-chapter-of-radix) ## Governance Snapshot Asset Rules URL: https://radix.wiki/ideas/governance-snapshot-rules Updated: 2026-02-22 Summary: Community-approved definition of eligible assets for governance voting weight in consultations. Type Governance Standard Participation 1.4B XRD / 743 accounts Threshold 60% approval required Overview The Radix community completed a consultation (https://www.radixdlt.com/blog/consultation-results-defining-eligible-assets-for-snapshots) to define which assets count as valid voting power in governance snapshots. Over 1.4 billion XRD from 743 accounts participated. 
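The 60% approval threshold applied per asset class can be sketched as follows. The dictionary keys are illustrative labels; the support figures are the published consultation results (using the low end of the 69–79% range reported for staked XRD):

```python
# Illustrative model of the consultation rule: an asset class counts as
# valid voting power only if it reached >= 60% support.

APPROVAL_THRESHOLD = 0.60

def is_approved(support: float) -> bool:
    return support >= APPROVAL_THRESHOLD

results = {
    "staked_xrd": 0.69,       # reported 69-79% (low end used here)
    "liquid_xrd": 0.69,
    "dex_pool_xrd": 0.60,     # exactly at the threshold, so approved
    "time_locked_xrd": 0.44,  # rejected
    "lent_xrd": 0.28,         # rejected
}
approved = {asset for asset, s in results.items() if is_approved(s)}
assert approved == {"staked_xrd", "liquid_xrd", "dex_pool_xrd"}
```

Note that DEX-pool XRD sits exactly on the threshold, which is why the published heading reads "≥60% Support" rather than ">60%".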
Approved Assets (≥60% Support) - XRD Staked on Any Validator — 69–79% support - Liquid XRD in Accounts — 69% support - XRD in Supported DEX Pools — 60% support (currently limited to Radix Rewards Season 1 pools) Rejected Assets (<60%) - Time-Locked XRD — 44% support - Lent XRD — 28% support External Links - Consultation Results: Defining Eligible Assets for Snapshots (https://www.radixdlt.com/blog/consultation-results-defining-eligible-assets-for-snapshots) ## Validator Subsidy Sunset URL: https://radix.wiki/ideas/validator-subsidy-sunset Updated: 2026-02-22 Summary: Phased wind-down of the Foundation validator subsidy from February through June 2026. Type Economic Policy Approval 71.6% (740.7M XRD) Participation 1.03B XRD / 604 accounts End Date June 2026 Overview The Radix community voted decisively (https://www.radixdlt.com/blog/consultation-results-the-future-of-the-validator-subsidy) to end the Foundation-administered validator subsidy, transitioning validators to fee-based sustainability. The plan tapers the subsidy over five months rather than halting it abruptly. Tapering Schedule Phase 1 — Foundation Administered - February 2026: Capped at 400k XRD or $350 USD (whichever is lower) - March 2026: Capped at 400k XRD or $200 USD (whichever is lower) Phase 2 — Community Administered - April–May 2026: Reduced to $100 USD paid in XRD Phase 3 — Conclusion - June 2026: Subsidy ends completely Impact Validators must transition to fee-based revenue models. The community recommends validators adjust fees, promote their nodes, and properly unregister if exiting. Stakers should diversify across multiple validators and monitor node performance during the transition period. 
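The "whichever is lower" capping in the taper can be sketched as a small helper. The schedule values come from the approved plan above; the XRD price used in the example is hypothetical, chosen only to show how the USD cap converts into an XRD amount:

```python
# Illustrative only: each month's subsidy is capped by an XRD amount and a
# USD value, whichever is lower at the prevailing price. Prices are inputs,
# not predictions.

def monthly_cap_xrd(month: str, xrd_usd_price: float) -> float:
    schedule = {
        "2026-02": (400_000, 350.0),  # 400k XRD or $350, whichever is lower
        "2026-03": (400_000, 200.0),  # 400k XRD or $200, whichever is lower
        "2026-04": (None, 100.0),     # $100 paid in XRD (community phase)
        "2026-05": (None, 100.0),
        "2026-06": (0, 0.0),          # subsidy ends
    }
    xrd_cap, usd_cap = schedule[month]
    usd_in_xrd = usd_cap / xrd_usd_price
    return usd_in_xrd if xrd_cap is None else min(xrd_cap, usd_in_xrd)

# At a hypothetical price of $0.005/XRD, $350 buys 70,000 XRD (< 400k cap):
assert monthly_cap_xrd("2026-02", 0.005) == 70_000.0
assert monthly_cap_xrd("2026-06", 0.005) == 0.0
```

At low XRD prices the USD cap binds; at very low prices the 400k XRD cap would bind instead, which is presumably why both limits exist.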
External Links - Consultation Results: The Future of the Validator Subsidy (https://www.radixdlt.com/blog/consultation-results-the-future-of-the-validator-subsidy) ## Radix Accountability Council URL: https://radix.wiki/ideas/radix-accountability-council Updated: 2026-02-22 Summary: Five-member elected council guiding the transition from Foundation-led to community-led governance. Type Governance Body Members 5 (elected Feb 2026) Participation 1.34B XRD / 1,151 accounts Role Facilitators, not commanders Overview The Radix Accountability Council (RAC) was established in February 2026 via community consultation (https://www.radixdlt.com/blog/consultation-results-radix-accountability-council) to guide the transition from Foundation-led to community-led governance. Over 1.34 billion XRD from 1,151 unique accounts participated in the vote. Elected Members - Peachy — 1.248B XRD (92.76%) - Faraz — 1.129B XRD (83.87%) - Jazzer_9F — 1.116B XRD (82.92%) - Avaunt — 786.4M XRD (58.44%) - projectShift — 769.9M XRD (57.22%) All elected members received supermajority support, indicating broad community consensus. Responsibilities - Establish operational framework for the Council - Coordinate administrative requirements for the 2026 transition - Serve as primary interface between community will and Foundation execution - Facilitate (not direct) decision-making on RFPs, governance proposals, and infrastructure transition External Links - Consultation Results: Radix Accountability Council (https://www.radixdlt.com/blog/consultation-results-radix-accountability-council) - 2026 Strategy: The Next Chapter of Radix (https://www.radixdlt.com/blog/2026-strategy-the-next-chapter-of-radix) ## Multi-Factor Security Shield URL: https://radix.wiki/ideas/mfa-security-shield Updated: 2026-02-22 Summary: On-chain multi-factor authentication with phone, hardware, and social recovery factors for Radix accounts. 
Type Security Feature Status Phase 2 on Stokenet Phases 3 (Create → Update → Recover) Factors Phone, Ledger, Arculus, Mnemonic, Trusted Contact Mainnet After all phases tested Overview Multi-Factor Smart Accounts bring network-level multi-factor authentication to Radix, replacing single seed phrase control with configurable "Security Shields." Unlike account abstraction on other chains, Radix implements MFA directly at the protocol level via the Access Controller (/contents/tech/core-concepts/access-controller) , eliminating single points of failure. A Security Shield combines multiple authentication factors — phone, Ledger hardware wallet, Arculus Card, off-device mnemonic, or a trusted person — into a rule set that governs account access. Rollout Phases Phase 1 — Create & Sign (Live on Stokenet) Users can configure a Security Shield and apply it to accounts or personas. The wallet enforces shield-based signing rules. Launched October 2025 (https://www.radixdlt.com/blog/multi-factor-smart-accounts-arrive-on-stokenet-phase-1-is-live) . Phase 2 — Update Shield (Live on Stokenet) Users can modify existing shield configurations and exercise recovery-related functions including timed delays. Launched December 2025 (https://www.radixdlt.com/blog/radix-review-2025) . Phase 3 — Recovery Without Backup (Upcoming) The final phase enables account recovery when a primary device is lost or compromised, completing the full MFA lifecycle. Mainnet Timeline All three phases are being tested on Stokenet first. Once community feedback and testing are complete across all phases, the feature will be deployed to mainnet. The 2026 Strategy (https://www.radixdlt.com/blog/2026-strategy-the-next-chapter-of-radix) lists MFA Phase 3 completion and mainnet launch as an immediate Phase 1 priority. 
External Links - MFA Phase 1 Announcement (https://www.radixdlt.com/blog/multi-factor-smart-accounts-arrive-on-stokenet-phase-1-is-live) - MFA Step-by-Step Rollout Guide (https://www.radixdlt.com/blog/introducing-multi-factor-smart-accounts-a-step-by-step-rollout-on-stokenet) ## Hyperscale-RS Community Implementation URL: https://radix.wiki/ideas/hyperscale-rs-community Updated: 2026-02-22 Summary: Community-led Rust implementation of the Hyperscale sharded consensus protocol. Type Community Development Language Rust Status Active Development Telegram hyperscale-rs for Radix (https://t.me/hyperscale_rs) Foundation Support Provisioned compute capacity Overview Hyperscale-RS is a community-led initiative to build a Rust implementation of the Hyperscale sharded consensus protocol. Following the Foundation's transition of Hyperscale development to the community (https://www.radixdlt.com/blog/interim-hyperscale-closing-the-chapter) , this project represents the primary path toward bringing sharded consensus to the Radix mainnet. The Radix Foundation has provisioned compute capacity for the team to run scaled tests, and the upcoming open-source release of the original Hyperscale codebase will provide additional reference material. Background The original Hyperscale implementation was developed under the Foundation's interim phase, achieving 500,000+ sustained TPS in public testing. With the 2026 strategy shifting to community-led development, the hyperscale-rs team — along with other community initiatives — is taking ownership of advancing the sharding roadmap. The team coordinates via the hyperscale-rs Telegram group (https://t.me/hyperscale_rs) (320+ members). 
External Links - hyperscale-rs Telegram Group (https://t.me/hyperscale_rs) - 2026 Strategy: The Next Chapter of Radix (https://www.radixdlt.com/blog/2026-strategy-the-next-chapter-of-radix) ## Hyperscale Open Source Release URL: https://radix.wiki/ideas/hyperscale-open-source Updated: 2026-02-22 Summary: Open-sourcing all Hyperscale code, documentation, and operational material for community reproduction and R&D. Type Open Source Release Status In Progress Performance 500k+ sustained TPS Peak TPS 800k+ Shards Tested 128 Nodes in Public Test 590+ Overview Following the completion of the interim Hyperscale phase (https://www.radixdlt.com/blog/interim-hyperscale-closing-the-chapter) , the Radix Foundation plans to open-source all remaining code, documentation, and operational material. This will enable the community to independently reproduce test results, verify performance claims, and build upon the codebase for future R&D. What Will Be Released - Source code — the full Hyperscale codebase used in the 500k+ TPS public test - Setup guides — Terraform and Ansible automation for deploying test networks - Network configuration — shard topology, validator setup, bootstrap node configs - Workload tooling — spam/load generation tools optimized for multi-core utilization - Reproducibility guidance — documentation for replicating the public test at scale - Dashboards and log files — observability tooling from the public test Test Results The January 2026 public test (https://www.radixdlt.com/blog/hyperscale-update-500k-public-test-done) sustained 500,000 TPS with peaks above 800,000 TPS. Transactions were real DeFi-style swaps executed across 128 shards on commodity AWS hardware (m6i.xlarge: 4 cores, 16GB RAM). The test confirmed linear scaling — doubling from 64 to 128 shards doubled throughput proportionally. The network included 384 bootstrap nodes, 40 validator nodes, and 6 load-generation nodes. 
External Links - Hyperscale Update: 500k+ Public Test Done (https://www.radixdlt.com/blog/hyperscale-update-500k-public-test-done) - Interim Hyperscale: Closing the Chapter (https://www.radixdlt.com/blog/interim-hyperscale-closing-the-chapter) ## Xi'an Protocol Upgrade URL: https://radix.wiki/ideas/xian-protocol-upgrade Updated: 2026-02-22 Summary: Fully sharded Cerberus consensus delivering infinite linear scalability and unlimited atomic composability. Type Protocol Upgrade Status Research / Testing Milestone Xi'an (4th major release) Predecessor Babylon (current mainnet) Key Feature Fully sharded Cerberus Test Network Cassandra Overview Xi'an is the fourth and final major milestone in the Radix roadmap (/contents/tech/core-concepts/radix-roadmap) , named after the ancient Chinese capital. It will implement the fully sharded form of the Cerberus consensus protocol (/contents/tech/core-protocols/cerberus-consensus) , delivering infinite linear scalability and unlimited atomic composability to the Radix network. Xi'an follows the progression from Olympia → Alexandria → Babylon → Xi'an (https://learn.radixdlt.com/article/what-is-the-radix-roadmap) , with each release adding fundamental capabilities. While Babylon (live since September 2023) introduced the full Radix stack (Scrypto, Radix Engine v2, Radix Wallet), Xi'an will unlock the network's ultimate scaling potential. 
Key Capabilities - Infinite linear scalability — adding more validator shards proportionally increases network throughput with no theoretical ceiling - Unlimited atomic composability — cross-shard transactions maintain full atomicity, preserving DeFi composability at any scale - Uncapped validator set — the current 100-validator cap is anticipated to be lifted, allowing unlimited validator participation - Commodity hardware — validators can run on standard datacenter or even desktop hardware, as demonstrated by Hyperscale testing on AWS m6i.xlarge instances (https://www.radixdlt.com/blog/hyperscale-update-500k-public-test-done) Current Progress Implementations of sharded Cerberus are being tested on the Cassandra test network. The Hyperscale public test (https://www.radixdlt.com/blog/hyperscale-update-500k-public-test-done) in January 2026 demonstrated 500,000+ sustained TPS across 128 shards on commodity hardware, validating the core scaling thesis. A community-led Rust implementation (hyperscale-rs) (https://t.me/hyperscale_rs) is also underway. No mainnet release date has been announced. The path to Xi'an depends on further Hyperscale R&D and community-driven development following the Foundation's 2026 transition strategy. External Links - Radix Roadmap — Knowledge Base (https://learn.radixdlt.com/article/what-is-the-radix-roadmap) - Cerberus Whitepaper (arXiv) (https://arxiv.org/pdf/2008.04450) - 2026 Strategy: The Next Chapter of Radix (https://www.radixdlt.com/blog/2026-strategy-the-next-chapter-of-radix) ## 3. Oracle Integration URL: https://radix.wiki/developers/infrastructure/03-oracle-integration Updated: 2026-02-22 Summary: Bring off-chain data on-ledger for Scrypto components using push and pull oracle patterns. 
Oracle Integration Pattern Type External Data Difficulty Intermediate–Advanced Concepts Oracles, Price Feeds, Cross-Ledger Data Challenge Scrypto Oracles Challenge (https://radixdlt.medium.com/scrypto-oracles-challenge-is-live-the-radix-blog-radix-dlt-6b958c4d9948) Example Provider RedStone on Radix (https://blog.redstone.finance/2025/06/12/redstone-brings-secure-gas-efficient-oracle-solutions-to-radix-defi-ecosystem/) Introduction Scrypto components execute deterministically on the Radix Engine (/contents/tech/core-protocols/radix-engine) and can only access data already present on the ledger — they cannot make HTTP calls or read external APIs. When a dApp needs external data (asset prices, exchange rates, weather data, random numbers), it requires an oracle: a service that bridges off-chain information to on-ledger state. Oracle design is one of the most challenging aspects of decentralised application development. This article covers the two main oracle patterns on Radix, their trade-offs, and how to integrate existing oracle providers. Push Oracles A push oracle is an off-chain service that periodically submits transactions to update on-ledger price data. A Scrypto component stores the latest values and exposes a read method that other components call. Architecture - An off-chain relayer monitors data sources (exchange APIs, aggregator feeds). - The relayer signs and submits a transaction calling update_price(asset, price, timestamp) on the oracle component. - The oracle component, protected by an admin badge (/developers/scrypto/03-authorization-and-badges) , stores the update in an on-ledger vault or key-value store. - Consumer components call get_price(asset) to read the latest value. Trade-offs - Pros: Simple consumer integration, predictable data freshness, works with any data type. - Cons: Relayer must pay transaction fees for every update, data staleness between updates, single-point-of-failure if the relayer goes offline. 
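The push pattern above can be modelled off-ledger to make the access-control and read paths concrete. The following is a minimal sketch, not real Scrypto: `update_price` and `get_price` mirror the method names described above, and the admin badge is reduced to a plain string check for illustration.

```typescript
// Minimal off-ledger model of a push oracle component's state.
// The admin badge (here a plain string) gates writes; reads are open.

type PriceEntry = { price: number; timestamp: number }

class PushOracleModel {
  private prices = new Map<string, PriceEntry>()

  constructor(private adminBadge: string) {}

  // Relayer-side path: only the admin badge holder may write updates.
  update_price(badge: string, asset: string, price: number, timestamp: number): void {
    if (badge !== this.adminBadge) throw new Error('admin badge required')
    this.prices.set(asset, { price, timestamp })
  }

  // Consumer-side path: read the latest stored value (and its timestamp,
  // so callers can apply their own staleness threshold).
  get_price(asset: string): PriceEntry {
    const entry = this.prices.get(asset)
    if (!entry) throw new Error(`no price for ${asset}`)
    return entry
  }
}

const oracle = new PushOracleModel('admin-badge')
oracle.update_price('admin-badge', 'XRD/USD', 0.025, Date.now())
console.log(oracle.get_price('XRD/USD').price) // 0.025
```

Returning the timestamp alongside the price is what lets consumers implement the staleness checks discussed in the mitigations.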
Mitigations Include a timestamp field in each price update and check staleness in consumer logic: reject prices older than a threshold (e.g. 5 minutes). Use multiple independent relayers with a median/aggregation mechanism to reduce trust assumptions. Pull Oracles A pull oracle delivers data within the consumer's transaction rather than pre-posting it on-ledger. The oracle provider signs the data off-chain, and the consumer's transaction manifest includes both the signed data payload and a verification call to the oracle contract. Architecture - The consumer's frontend fetches a signed price attestation from the oracle provider's API. - The frontend constructs a transaction manifest (/contents/tech/core-protocols/transaction-manifests) that first calls the oracle component's verify_and_store(signed_data) method, then calls the consumer component's business logic. - The oracle component verifies the provider's signature and exposes the data for the remainder of the transaction. Trade-offs - Pros: Data is always fresh (fetched at transaction time), no relayer infrastructure needed, consumer pays the gas. - Cons: More complex frontend integration, requires the oracle provider to run a signing API, consumer must handle the multi-step manifest. RedStone on Radix RedStone (https://blog.redstone.finance/2025/06/12/redstone-brings-secure-gas-efficient-oracle-solutions-to-radix-defi-ecosystem/) is an example of a pull oracle that has integrated with Radix. Its price feeds are delivered as signed payloads that Scrypto components verify on-chain, treating the data as a first-class resource within the asset-oriented (/contents/tech/core-concepts/asset-oriented-programming) model. 
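The pull-oracle manifest ordering (verification first, business logic second) can be sketched as a small builder. This is illustrative only: the addresses are placeholders, `verify_and_store` is the method name used above, and `execute_with_price` is a hypothetical consumer method.

```typescript
// Sketch of a pull-oracle transaction manifest: verify the signed payload
// on-ledger first, then run the consumer's logic in the same transaction.

function buildPullOracleManifest(
  oracleAddress: string,    // placeholder oracle component address
  consumerAddress: string,  // placeholder consumer component address
  signedData: string,       // hex attestation fetched from the provider's API
): string {
  return `
CALL_METHOD
  Address("${oracleAddress}")
  "verify_and_store"
  Bytes("${signedData}")
;
CALL_METHOD
  Address("${consumerAddress}")
  "execute_with_price"
;
`.trim()
}

const manifest = buildPullOracleManifest(
  'component_rdx1_oracle_placeholder',
  'component_rdx1_consumer_placeholder',
  'deadbeef',
)
console.log(manifest.includes('verify_and_store')) // true
```

Because both calls sit in one atomic transaction, the consumer either runs against a freshly verified price or the whole transaction fails.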
External Links - Scrypto Oracles Challenge — Radix Blog (https://radixdlt.medium.com/scrypto-oracles-challenge-is-live-the-radix-blog-radix-dlt-6b958c4d9948) - RedStone on Radix — RedStone Blog (https://blog.redstone.finance/2025/06/12/redstone-brings-secure-gas-efficient-oracle-solutions-to-radix-defi-ecosystem/) - Will Radix Offer Decentralized Data Storage? — Radix Blog (https://www.radixdlt.com/blog/will-radix-offer-decentralized-data-storage) ## 2. Radix APIs URL: https://radix.wiki/developers/infrastructure/02-radix-apis Updated: 2026-02-22 Summary: Compare the Gateway, Core, Engine State, and System APIs and choose the right one for your use case. Radix APIs Difficulty Intermediate Est. Time 15 minutes Prerequisites None Language TypeScript / HTTP Gateway Spec Gateway API Reference (https://radix-babylon-gateway-api.redoc.ly/) Core Spec Core API Reference (https://radix-babylon-core-api.redoc.ly/) API Overview Radix exposes four APIs at different abstraction levels. Most developers only need the Gateway API. API Purpose Best For Access Gateway High-level ledger queries and transaction submission dApps, wallets, explorers Public endpoints Core Low-level transaction and state data from nodes Exchanges, analytics, integrators Private / self-hosted Engine State Complete current ledger state at engine level Advanced state queries Self-hosted (requires node config) System Node health and diagnostics Node operators Local only Gateway API The Gateway API (https://radix-babylon-gateway-api.redoc.ly/) is the primary interface for dApp developers. It provides entity state queries, transaction submission, stream queries, and historical state access — all via JSON POST requests with cursor-based pagination. Official Endpoints Network URL Mainnet https://mainnet.radixdlt.com Stokenet https://stokenet.radixdlt.com These are rate-limited by IP. For production dApps, use a third-party provider or run your own Gateway (/developers/infrastructure/01-running-a-node) . 
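Since the Gateway API is plain JSON over POST, simple queries need no SDK at all. A minimal sketch against the public mainnet endpoint above, using the gateway-status endpoint:

```typescript
// Build a raw Gateway API request (JSON POST, no SDK required).

const GATEWAY_BASE = 'https://mainnet.radixdlt.com'

function gatewayRequest(path: string, body: object) {
  return {
    url: `${GATEWAY_BASE}${path}`,
    init: {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(body),
    },
  }
}

// Current ledger state version and epoch (empty request body):
const { url, init } = gatewayRequest('/status/gateway-status', {})
console.log(url) // https://mainnet.radixdlt.com/status/gateway-status
// fetch(url, init).then(r => r.json()).then(s => console.log(s.ledger_state))
```

For anything beyond one-off queries, the typed SDK below is the more maintainable option.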
Key Endpoint Groups - /state/* — entity details, resource balances, metadata, vault contents - /transaction/* — submit, preview, and check transaction status - /stream/transactions — query committed transactions with filters - /status/gateway-status — current ledger state version and epoch Use the Gateway SDK (/developers/frontend/02-gateway-sdk) (@radixdlt/babylon-gateway-api-sdk) for typed TypeScript access. Core API The Core API (https://radix-babylon-core-api.redoc.ly/) gives direct access to a node's view of the ledger — transaction-level detail, raw state, and streaming committed transactions. It includes an LTS (Long-Term Support) sub-API designed for exchange integrators. LTS Endpoints - /lts/transaction/submit — submit a notarized transaction - /lts/transaction/status — check transaction status by intent hash - /lts/state/account-all-fungible-resource-balances — all token balances for an account - /lts/transaction/construction — get current epoch for building transactions The Core API requires running your own node. Third-party providers include Grove (https://portal.grove.city) (free tier: 5M requests/day) and NowNodes (https://nownodes.io) . Engine State and System APIs Engine State API Provides complete current state at the Radix Engine abstraction level. More comprehensive than Core API state queries but requires explicit node configuration (available since node v1.1.3.1). Useful for advanced state inspection without running a full Gateway. System API Node diagnostics only — health checks, connection info, performance metrics. Used for monitoring, not application development. Which API Should I Use? 
Use Case API Frontend dApp reading account state Gateway SDK (/developers/frontend/02-gateway-sdk) Submitting transactions from dApp Gateway (via dApp Toolkit (/developers/frontend/01-radix-dapp-toolkit) ) Exchange integration (deposits, withdrawals) Core API (LTS endpoints) Indexing / analytics pipeline Core API (streaming) Node health monitoring System API External Links - Gateway API reference (https://radix-babylon-gateway-api.redoc.ly/) - Core API reference (https://radix-babylon-core-api.redoc.ly/) - Gateway API providers (/developers/legacy-docs/integrate/network-apis/gateway-api-providers) - Core API providers (/developers/legacy-docs/integrate/network-apis/core-api-providers) - API comparison (Learn Radix) (https://learn.radixdlt.com/article/what-apis-are-relevant-to-the-radix-public-network) ## 1. Running a Radix Node URL: https://radix.wiki/developers/infrastructure/01-running-a-node Updated: 2026-02-22 Summary: Set up a Radix full node or validator using Docker or systemd on Ubuntu. Running a Radix Node Difficulty Intermediate Est. Time 60 minutes Prerequisites Linux server with SSH access Language Bash Official Docs Node Introduction (https://docs.radixdlt.com/docs/node-introduction) Overview A Radix node connects to the peer-to-peer network, syncs ledger state, and (if registered as a validator) participates in Cerberus consensus (/contents/tech/core-concepts/cerberus-consensus) . You can run a node to power your own API endpoints (/developers/infrastructure/02-radix-apis) , support the network, or earn staking emissions (/contents/tech/core-concepts/network-emissions) as a validator. Hardware Requirements Resource Minimum Recommended (Validators) CPU 4 vCPU 8 vCPU (e.g., AWS c5.2xlarge) RAM 16 GB 32 GB+ Storage 500 GB SSD 1 TB+ NVMe OS Ubuntu 22.04 LTS (Jammy Jellyfish) Network Port 30000/tcp open for gossip Validators need reliability Validators should run on dedicated hardware or high-availability cloud instances. 
Downtime reduces your share of emissions (/contents/tech/core-concepts/network-emissions) and may cause delegators to move their stake. Installation Radix provides three installation methods: Option A: CLI-Guided Install (Recommended) The babylonnode CLI tool walks you through configuration interactively: wget -O babylonnode https://github.com/radixdlt/babylon-nodecli/releases/latest/download/babylonnode-ubuntu22.04 chmod +x babylonnode sudo mv babylonnode /usr/local/bin/ # Run guided setup babylonnode docker install Option B: Docker Manual Setup For more control, use docker-compose directly with the official images from radixdlt/babylon-node (https://github.com/radixdlt/babylon-node) . Option C: Systemd Run the node process directly on the host, managed by systemd. Better for servers shared with other services. All methods are documented at docs.radixdlt.com/docs/node-setup (https://docs.radixdlt.com/docs/node-setup) . Becoming a Validator Once your node is synced, you can register it as a validator to participate in consensus and earn emissions. - Register — submit a validator registration transaction (costs 5–30 XRD in fees) - Configure — set your validator name, website URL, fee percentage, and delegation policy - Attract stake — the top 100 validators by total delegated stake are active each epoch - Maintain uptime — missed proposals reduce your emissions share See the validator setup guide (https://docs.radixdlt.com/docs/validator-setup) for detailed steps. Monitoring Use the standard Prometheus + Grafana stack for observability: - Install Node Exporter for system metrics (CPU, RAM, disk, network) - Configure Prometheus to scrape your node's metrics endpoint - Set up Grafana dashboards to visualize node health, sync status, and proposal statistics The node also exposes a System API for diagnostics — connection status, health checks, and performance metrics. See Radix APIs (/developers/infrastructure/02-radix-apis) for details. 
Maintenance - Protocol updates — update your node software when new protocol versions are released; validators must signal readiness - Backups — the ledger can be rebuilt from the network, but backing up your validator key is critical - Snapshots — daily ledger snapshots are available at snapshots.radix.live (https://snapshots.radix.live) for faster initial sync External Links - Node introduction (https://docs.radixdlt.com/docs/node-introduction) - Node setup guide (https://docs.radixdlt.com/docs/node-setup) - Validator setup (https://docs.radixdlt.com/docs/validator-setup) - Maintenance guide (https://docs.radixdlt.com/docs/node-maintenance) - babylon-node GitHub (https://github.com/radixdlt/babylon-node) ## 3. ROLA Authentication URL: https://radix.wiki/developers/frontend/03-rola-authentication Updated: 2026-02-22 Summary: Authenticate users off-ledger using the Radix Off-Ledger Auth challenge-response protocol. ROLA Authentication Difficulty Intermediate Est. Time 25 minutes Prerequisites Radix dApp Toolkit (/developers/frontend/01-radix-dapp-toolkit) Language TypeScript NPM @radixdlt/rola (https://www.npmjs.com/package/@radixdlt/rola) What is ROLA? ROLA (https://docs.radixdlt.com/docs/rola-radix-off-ledger-auth) (Radix Off-Ledger Authentication) is a challenge-response protocol that lets your backend verify a user owns a Radix account or persona — without submitting any on-ledger transaction. It's the Radix equivalent of "Sign-In with Ethereum" but uses the native Radix Wallet. 
How It Works - Backend generates a challenge — a random 32-byte hex string stored with a 5-minute expiry - Frontend requests proof — sends the challenge to the Radix Wallet via requestProofOfOwnership() - User approves in wallet — wallet signs the challenge with the account's private key (Ed25519 or secp256k1) - Frontend sends proof to backend — contains public key, signature, and curve type - Backend verifies — checks the signature, confirms the public key matches the account's owner_keys metadata on ledger, and deletes the challenge Implementation 1. Generate Challenges (Backend) import crypto from 'crypto' // Store challenges with expiry (use Redis, DB, or in-memory Map) const challenges = new Map() function createChallenge(): string { const challenge = crypto.randomBytes(32).toString('hex') challenges.set(challenge, { expires: Date.now() + 5 * 60 * 1000 // 5 minutes }) return challenge } 2. Request Proof (Frontend) import { DataRequestBuilder } from '@radixdlt/radix-dapp-toolkit' // Fetch challenge from your backend const challenge = await fetch('/api/auth/challenge').then(r => r.json()) // Request proof of account ownership const result = await rdt.walletApi.sendRequest( DataRequestBuilder.accounts() .atLeast(1) .withProof(challenge) ) 3. Verify Proof (Backend) import { Rola } from '@radixdlt/rola' import { RadixNetwork } from '@radixdlt/babylon-gateway-api-sdk' const rola = Rola({ networkId: RadixNetwork.Mainnet, applicationName: 'My dApp', dAppDefinitionAddress: 'account_rdx...', expectedOrigin: 'https://my-dapp.com' }) // Verify the signed challenge const result = await rola.verifySignedChallenge({ challenge: challengeHex, proof: { publicKey, signature, curve }, address: accountAddress, type: 'account' }) if (result.isOk()) { // Create session (JWT, cookie, etc.) } Delete used challenges Always delete a challenge after verification — successful or not. This prevents replay attacks where a captured proof is resubmitted. 
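The single-use property described in the callout can be made explicit in the challenge store. A minimal sketch (function names are illustrative, not part of the ROLA library): deleting before checking expiry guarantees a challenge can never be presented twice.

```typescript
// Single-use challenge store: valid exactly once, and only before expiry.

const challengeStore = new Map<string, { expires: number }>()

function issueChallenge(hex: string, ttlMs = 5 * 60 * 1000): void {
  challengeStore.set(hex, { expires: Date.now() + ttlMs })
}

// Consume atomically: delete first, then check expiry, so a captured
// proof cannot be replayed even if downstream verification fails.
function consumeChallenge(hex: string): boolean {
  const entry = challengeStore.get(hex)
  challengeStore.delete(hex)
  return !!entry && entry.expires > Date.now()
}

issueChallenge('abc123')
console.log(consumeChallenge('abc123')) // true  (first use)
console.log(consumeChallenge('abc123')) // false (replay rejected)
```

In production the Map would be Redis or a database with a TTL, but the consume-before-verify ordering is the part that matters.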
When to Use ROLA Use Case Auth Method User login / session creation ROLA Prove account ownership to backend ROLA Transfer assets or call components On-ledger transaction Gate content by badge ownership ROLA + Gateway query ROLA proves identity. On-ledger transactions perform actions. For token-gated access, verify ownership via ROLA then query the account's resources via the Gateway SDK (/developers/frontend/02-gateway-sdk) . External Links - ROLA documentation (https://docs.radixdlt.com/docs/rola-radix-off-ledger-auth) - @radixdlt/rola on npm (https://www.npmjs.com/package/@radixdlt/rola) - ROLA examples repository (https://github.com/radixdlt/rola-examples) ## 2. Gateway SDK: Reading Ledger State URL: https://radix.wiki/developers/frontend/02-gateway-sdk Updated: 2026-02-22 Summary: Query account balances, component state, and transaction history using the Radix Gateway SDK. Gateway SDK Difficulty Intermediate Est. Time 20 minutes Prerequisites Radix dApp Toolkit (/developers/frontend/01-radix-dapp-toolkit) Language TypeScript NPM @radixdlt/babylon-gateway-api-sdk (https://www.npmjs.com/package/@radixdlt/babylon-gateway-api-sdk) Overview The Gateway SDK (https://www.npmjs.com/package/@radixdlt/babylon-gateway-api-sdk) is a TypeScript client for the Radix Gateway API (/developers/infrastructure/02-radix-apis) . It lets you query account balances, read component state, look up transaction history, and submit transactions — all the read/write operations your dApp frontend needs. npm install @radixdlt/babylon-gateway-api-sdk Setup import { GatewayApiClient, RadixNetwork } from '@radixdlt/babylon-gateway-api-sdk' const gatewayApi = GatewayApiClient.initialize({ networkId: RadixNetwork.Mainnet, // or RadixNetwork.Stokenet applicationName: 'My dApp', applicationVersion: '1.0.0', applicationDappDefinitionAddress: 'account_rdx...' }) The SDK connects to the Radix Foundation's public Gateway by default. 
For production dApps with high traffic, consider running your own Gateway node (/developers/infrastructure/02-radix-apis) or using a third-party provider. Common Queries Account Balances const details = await gatewayApi.state.getEntityDetailsVaultAggregated( 'account_rdx1c956...' // account address ) // details.fungible_resources — token balances // details.non_fungible_resources — NFT holdings Component State const component = await gatewayApi.state.getEntityDetailsVaultAggregated( 'component_rdx1cp...' // component address ) // Read state, vaults, metadata Entity Metadata const metadata = await gatewayApi.state.getAllEntityMetadata( 'resource_rdx1t...' // any entity address ) // Returns: name, symbol, description, icon_url, etc. Transaction Status const status = await gatewayApi.transaction.getStatus( 'txid_rdx1...' // intent hash from sendTransaction ) // status.intent_status: "CommittedSuccess" | "CommittedFailure" | "Pending" | ... Historical State Query state at a specific point in time by passing at_ledger_state: const historicalDetails = await gatewayApi.state.innerClient.stateEntityDetails({ stateEntityDetailsRequest: { addresses: ['account_rdx1...'], at_ledger_state: { state_version: 12345678 } } }) External Links - @radixdlt/babylon-gateway-api-sdk on npm (https://www.npmjs.com/package/@radixdlt/babylon-gateway-api-sdk) - Gateway API reference (https://radix-babylon-gateway-api.redoc.ly/) - Mainnet Gateway Swagger UI (https://mainnet.radixdlt.com/swagger/) ## 1. Radix dApp Toolkit URL: https://radix.wiki/developers/frontend/01-radix-dapp-toolkit Updated: 2026-02-22 Summary: Connect a web application to the Radix Wallet and interact with on-ledger components using the Radix dApp Toolkit. Building a Frontend dApp Difficulty Intermediate Est. 
Time 60–90 minutes Prerequisites Node.js, a deployed Scrypto package (/developers/getting-started/03-deploying) Language TypeScript / JavaScript Key Library @radixdlt/radix-dapp-toolkit (https://www.npmjs.com/package/@radixdlt/radix-dapp-toolkit) Scaffolding npx create-radix-app@latest Official Docs Building a Frontend dApp (/developers/legacy-docs/build/build-dapps/dapp-application-stack/building-a-frontend-dapp) Introduction A Radix dApp consists of two layers: on-ledger blueprints (/contents/tech/core-concepts/blueprints-and-packages) deployed as Scrypto packages, and an off-ledger frontend that connects to the user's Radix Wallet (/contents/tech/core-protocols/radix-wallet) to sign and submit transactions. The Radix dApp Toolkit (RDT) (https://github.com/radixdlt/radix-dapp-toolkit) is the official TypeScript library that handles wallet connection, session management, data requests, and transaction submission. This guide covers integrating RDT into a web application. Project Setup The fastest way to start is the official scaffolding tool: npx create-radix-app@latest This generates a project with RDT pre-configured, the Connect Button wired up, and example transaction code. Alternatively, add RDT to an existing project: npm install @radixdlt/radix-dapp-toolkit Initialising the Toolkit Create an RDT instance with your dApp's configuration: import { RadixDappToolkit, RadixNetwork } from '@radixdlt/radix-dapp-toolkit' const rdt = RadixDappToolkit({ dAppDefinitionAddress: 'account_rdx...', // your dApp definition account networkId: RadixNetwork.Stokenet, // or RadixNetwork.Mainnet applicationName: 'My dApp', applicationVersion: '1.0.0', }) The dAppDefinitionAddress is a Radix account that you own, registered as your dApp's identity through the Developer Console (https://console.radixdlt.com/) metadata settings. This is how the wallet identifies your application to the user. 
Wallet Connection & Authentication RDT provides the √ Connect Button, a framework-agnostic web component that handles the full wallet connection flow: When a user clicks the button, RDT coordinates with the Radix Wallet Connector (https://chromewebstore.google.com/detail/radix-wallet-connector/bfeplaecgkoeckiidkgkmlllfbaanlkl) browser extension to establish a connection to the user's mobile wallet. The user authenticates using a Persona — a reusable identity that can share selected accounts and personal data with your dApp. Requesting Account Data Configure what data your dApp needs at connection time: import { DataRequestBuilder } from '@radixdlt/radix-dapp-toolkit' rdt.walletApi.setRequestData( DataRequestBuilder.accounts().atLeast(1), DataRequestBuilder.persona().withProof(), ) This asks the user to share at least one account address and a cryptographic proof of Persona ownership. For ROLA (Radix Off-Ledger Authentication) (/contents/tech/core-protocols/rola-authentication) , provide a challenge generator that fetches a 32-byte hex challenge from your backend: rdt.walletApi.provideChallengeGenerator(async () => { const res = await fetch('/api/auth/challenge') return (await res.json()).challenge }) Submitting Transactions Radix transactions are built using transaction manifests (/contents/tech/core-protocols/transaction-manifests) — a declarative syntax that describes what the transaction should do. Your dApp sends a manifest stub to the wallet; the wallet completes it by adding fee payment and any user-specified assertions. const result = await rdt.walletApi.sendTransaction({ transactionManifest: ` CALL_METHOD Address("component_rdx...") "buy_token" Decimal("100") ; CALL_METHOD Address("${accountAddress}") "deposit_batch" Expression("ENTIRE_WORKTOP") ; `, }) The user reviews the transaction in their wallet — seeing exactly which assets move where — signs it, and the wallet submits it to the network. 
Your dApp receives a transaction hash that you can track via the Gateway API (/contents/tech/core-protocols/radix-gateway-api) . Reacting to Wallet Data Subscribe to wallet state changes to update your UI in real time: rdt.walletApi.walletData$.subscribe((walletData) => { const accounts = walletData.accounts // Update UI with connected accounts }) External Links - Radix dApp Toolkit — GitHub (https://github.com/radixdlt/radix-dapp-toolkit) - @radixdlt/radix-dapp-toolkit — npm (https://www.npmjs.com/package/@radixdlt/radix-dapp-toolkit) - Building a Frontend dApp — Official Docs (https://docs.radixdlt.com/docs/building-a-frontend-dapp) - Run Your First Frontend dApp — Official Docs (https://docs.radixdlt.com/docs/learning-to-run-your-first-front-end-dapp) - Radix Wallet SDK — GitHub (https://github.com/radixdlt/wallet-sdk) ## 2. Transaction Lifecycle URL: https://radix.wiki/developers/transactions/02-transaction-lifecycle Updated: 2026-02-22 Summary: Understand how transactions are built, signed, notarized, and submitted to the Radix network. Transaction Lifecycle Difficulty Intermediate Est. Time 20 minutes Prerequisites Transaction Manifest Language (/developers/transactions/01-manifest-language) Language Manifest / TypeScript Official Docs Transactions (/developers/legacy-docs/reference/transactions/transactions) Transaction Structure A Radix transaction (https://docs.radixdlt.com/docs/transaction-structure) has three layers, each wrapping the previous: Layer Contains Purpose Intent Header + manifest Defines what to do and when Signed Intent Intent + signatures Authorizes the transaction Notarized Transaction Signed intent + notary signature Final submission-ready payload The header specifies the validity window (epoch range, ~5 min per epoch), the notary's public key, and a nonce. The manifest contains the instructions. Build → Sign → Submit 1. Build Construct the transaction intent with a manifest and header. 
The epoch window determines how long the transaction remains valid — typically the current epoch ± 2. 2. Sign One or more parties sign the intent hash with their private keys (Ed25519 or ECDSA secp256k1). For wallet-submitted transactions, the dApp Toolkit (/developers/frontend/01-radix-dapp-toolkit) handles this automatically. 3. Notarize The notary signs the signed intent hash. In most dApps, the wallet acts as both signer and notary. 4. Submit The compiled payload is sent to the network via the Gateway or Core API (/developers/infrastructure/02-radix-apis) . 5. Track Poll the transaction status using the intent hash. Outcomes: - Committed — accepted on ledger (success or application-level failure) - Permanently rejected — invalid structure or expired epoch - Temporarily rejected — may become valid later (e.g., pending dependency) Subintents and Pre-Authorization Subintents (/developers/legacy-docs/reference/transactions/subintents) are independent mini-transactions that compose into a single atomic transaction. Each subintent has its own manifest, header, and signers. Why Subintents? - Delegated fee payment — a dApp pays fees on behalf of the user, solving the "double onboarding" problem (user needs XRD before they can do anything) - Peer-to-peer trading — buyer and seller each sign a subintent; a matchmaker combines them into an atomic swap - Flash loans — a lending subintent provides liquidity, the user's subintent uses it, and a payback subintent returns it — all atomically Key Instructions # Parent defines a child subintent DEFINE_CHILD SubintentHash("deadbeef..."); # Parent yields to child YIELD_TO_CHILD SubintentHash("deadbeef...") Expression("ENTIRE_WORKTOP"); # Child yields back to parent YIELD_TO_PARENT Expression("ENTIRE_WORKTOP"); Pre-Authorization Flow Pre-authorization (https://docs.radixdlt.com/docs/pre-authorizations-and-subintents) lets users sign subintents in advance. 
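The parent-side instructions above can be assembled mechanically from the child subintent hashes. A sketch that renders a parent manifest fragment in TypeScript (the helper and its types are illustrative; actual combination and submission use real signed subintents via the toolkit and Gateway):

```typescript
// Illustrative shape of a signed subintent — only the hash is needed here.
interface SignedSubintent {
  subintentHash: string; // e.g. "deadbeef..."
}

// Render DEFINE_CHILD / YIELD_TO_CHILD lines for each child, mirroring the
// Key Instructions shown in the text. Builds manifest text only.
function buildParentManifest(children: SignedSubintent[]): string {
  const defines = children
    .map((c) => `DEFINE_CHILD SubintentHash("${c.subintentHash}");`)
    .join("\n");
  const yields = children
    .map((c) => `YIELD_TO_CHILD SubintentHash("${c.subintentHash}") Expression("ENTIRE_WORKTOP");`)
    .join("\n");
  return `${defines}\n${yields}`;
}
```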
The dApp can later combine them with its own subintents and submit whenever ready. - dApp sends a pre-auth request to the wallet with a subintent manifest - User reviews and signs in the wallet - Wallet returns the signed subintent to the dApp - dApp combines it with other subintents and submits as a complete transaction Practical example A user pre-authorizes "swap 100 USDC for GBP" via a subintent. The dApp adds a fee-payment subintent and submits the combined transaction — the user never needs to hold XRD for fees. External Links - Transactions overview (https://docs.radixdlt.com/docs/transactions) - Transaction structure (/developers/legacy-docs/reference/transactions/transaction-structure) - Subintents (/developers/legacy-docs/reference/transactions/subintents) - Pre-authorization flow (https://docs.radixdlt.com/docs/pre-authorizations-and-subintents) ## 1. Transaction Manifest Language URL: https://radix.wiki/developers/transactions/01-manifest-language Updated: 2026-02-22 Summary: Write human-readable transaction manifests to move assets, call components, and compose multi-step operations. Transaction Manifest Language Difficulty Intermediate Est. Time 25 minutes Prerequisites Scrypto Fundamentals (/developers/scrypto/01-fundamentals) Language Manifest Official Docs Transaction Manifest (https://docs.radixdlt.com/docs/transaction-manifest) What is a Transaction Manifest? A transaction manifest (https://docs.radixdlt.com/docs/transaction-manifest) is a human-readable script that describes exactly what a transaction does — which assets move where, which components are called, and in what order. Unlike EVM bytecode, manifests are transparent: the Radix Wallet (/contents/tech/core-protocols/radix-wallet) summarizes them in plain language so users never "blind sign." Manifests use a bash-like syntax where each instruction is a command followed by typed arguments and a semicolon. 
Instructions execute sequentially — if any fails, the entire transaction rolls back atomically. The Worktop Every transaction has a worktop — a temporary holding area for resources in transit. When you withdraw tokens from an account, they land on the worktop. You then take them into named buckets and pass them to methods or deposit them. # Withdraw 10 XRD — tokens land on worktop CALL_METHOD Address("account_rdx...") "withdraw" Address("resource_rdx1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3") Decimal("10") ; # Take from worktop into a named bucket TAKE_FROM_WORKTOP Address("resource_rdx1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3") Decimal("10") Bucket("xrd_bucket") ; Core Instructions Instruction Purpose CALL_METHOD Call a method on a component (e.g., withdraw, deposit, swap) CALL_FUNCTION Call a function on a blueprint (e.g., instantiate) TAKE_FROM_WORKTOP Take a specific amount from worktop into a bucket TAKE_ALL_FROM_WORKTOP Take all of a resource from worktop into a bucket ASSERT_WORKTOP_CONTAINS Assert minimum amount on worktop (slippage protection) CREATE_PROOF_FROM_BUCKET_OF_AMOUNT Create an authorization proof from a bucket Value Types - Address("resource_rdx...") — entity addresses - Decimal("10.5") — numeric amounts - Bucket("name") — named bucket references - Proof("name") — named proof references - Expression("ENTIRE_WORKTOP") — deposit everything remaining Common Patterns Simple Token Transfer CALL_METHOD Address("account_rdx_sender...") "lock_fee" Decimal("5"); CALL_METHOD Address("account_rdx_sender...") "withdraw" Address("resource_rdx1tk...xrd...") Decimal("100") ; TAKE_ALL_FROM_WORKTOP Address("resource_rdx1tk...xrd...") Bucket("tokens"); CALL_METHOD Address("account_rdx_recipient...") "try_deposit_or_abort" Bucket("tokens") Enum<0u8>() ; DEX Swap with Slippage Protection CALL_METHOD Address("account_rdx...") "withdraw" Address("token_a_addr") Decimal("100") ; TAKE_ALL_FROM_WORKTOP Address("token_a_addr") Bucket("input");
CALL_METHOD Address("dex_component_addr") "swap" Bucket("input") ; # Ensure minimum output — reverts if not met ASSERT_WORKTOP_CONTAINS Address("token_b_addr") Decimal("95"); CALL_METHOD Address("account_rdx...") "deposit_batch" Expression("ENTIRE_WORKTOP") ; Atomic composability You can chain calls to multiple components in one manifest. If a DEX swap feeds into a lending protocol deposit, both succeed or both fail — no partial state. Building Manifests Programmatically While you can write manifests by hand, most dApps build them in code: - TypeScript — the Radix dApp Toolkit (/developers/frontend/01-radix-dapp-toolkit) sends manifests via sendTransaction - Rust — the ManifestBuilder (https://docs.radixdlt.com/docs/rust-manifest-builder) provides a fluent API - Console — the Developer Console (https://stokenet-console.radixdlt.com) has a raw transaction editor External Links - Complete instruction reference (https://docs.radixdlt.com/docs/manifest-instructions) - Manifest value syntax (/developers/legacy-docs/reference/sbor-serialization/manifest-sbor/manifest-value-syntax) - Simple token transfer example (https://docs.radixdlt.com/docs/simple-token-transfer) - Rust ManifestBuilder (https://docs.radixdlt.com/docs/rust-manifest-builder) ## 7. Multi-Component Architecture URL: https://radix.wiki/developers/scrypto/07-multi-component-architecture Updated: 2026-02-22 Summary: Design modular Scrypto dApps by composing small, reusable blueprints into larger systems.
Multi-Component Architecture Pattern Type Application Architecture Difficulty Intermediate Concepts Blueprints & Packages (/contents/tech/core-concepts/blueprints-and-packages) , Composition Official Docs Scrypto Design Patterns (https://docs.radixdlt.com/docs/scrypto-design-patterns) Catalog Blueprint Catalog (https://www.radixdlt.com/blog/build-defi-dapps-faster-on-radix) Introduction Scrypto separates the concept of "smart contract" into two distinct things: blueprints (/contents/tech/core-concepts/blueprints-and-packages) (reusable templates) and components (live instances on the ledger). This separation enables a modular architecture where small, well-tested blueprints are composed into larger systems — much like how object-oriented programming uses classes and composition. The official Scrypto Design Patterns (https://docs.radixdlt.com/docs/scrypto-design-patterns) documentation recommends this approach as the default for non-trivial dApps. Separation of Concerns Instead of building a monolithic blueprint, decompose your dApp into small blueprints with a single responsibility: - TokenSale — handles pricing and token distribution - Treasury — manages collected funds, fee withdrawal - AccessControl — issues and manages badges (/developers/scrypto/03-authorization-and-badges) - PriceFeed — wraps an oracle (/developers/infrastructure/03-oracle-integration) and exposes prices to other components Each blueprint is easier to test in isolation, reason about for security, and audit independently. Smaller blueprints are also more reusable — a Treasury blueprint can serve a DEX, a lending protocol, or a DAO with no changes. Inter-Component Calls Components call each other's methods using their on-ledger address. 
A component can store another component's address in its state and invoke methods on it: struct MyDapp { treasury: Global<Treasury>, price_feed: Global<PriceFeed>, } impl MyDapp { pub fn buy(&mut self, payment: Bucket) -> Bucket { let price = self.price_feed.get_price("XRD/USD"); // ... calculate tokens owed ... self.treasury.deposit(payment); // ... return tokens ... } } Blueprint Reuse & the Catalog Radix encourages sharing blueprints as published packages that anyone can instantiate. The Blueprint Catalog (https://www.radixdlt.com/blog/build-defi-dapps-faster-on-radix) concept allows developers to find, audit, and reuse existing blueprints rather than reimplementing common patterns. Importing External Blueprints To use a blueprint from another package, import it by its package address and instantiate or call it. The Scrypto extern_blueprint! macro generates type-safe bindings: extern_blueprint! { "package_rdx...", // on-ledger package address PriceFeed { // blueprint name fn get_price(&self, pair: String) -> Decimal; } } This generates a type you can use in your component's state and method signatures, with full compile-time type checking. Factory Pattern A common pattern for dApps that create multiple similar components (e.g. a DEX creating liquidity pools): write a factory blueprint whose instantiation function creates and configures child components, returning their addresses and any admin badges. External Links - Scrypto Design Patterns — Official Docs (https://docs.radixdlt.com/docs/scrypto-design-patterns) - Blueprints and Components — Official Docs (https://docs.radixdlt.com/docs/blueprints-and-components) - Blueprint Catalog: Build DeFi dApps Faster — Radix Blog (https://www.radixdlt.com/blog/build-defi-dapps-faster-on-radix) - Reusable Blueprints Pattern — Official Docs (https://docs-babylon.radixdlt.com/main/scrypto/design-patterns/reusable-blueprint-pattern.html) ## 6.
Vault and Resource Patterns URL: https://radix.wiki/developers/scrypto/06-vault-patterns Updated: 2026-02-22 Summary: Manage on-ledger resources safely using vaults, buckets, and Scrypto resource containers. Vault & Resource Patterns Pattern Type Resource Management Difficulty Beginner Concepts Buckets, Proofs & Vaults (/contents/tech/core-concepts/buckets-proofs-and-vaults) Official Docs Vaults and Buckets (https://docs.radixdlt.com/docs/buckets-and-vaults) Introduction Radix enforces a strict invariant: every resource (/contents/tech/core-concepts/asset-oriented-programming) must exist inside a container at all times. There are two container types: Vaults (permanent, on-ledger storage) and Buckets (temporary, transaction-scoped carriers). The Radix Engine (/contents/tech/core-protocols/radix-engine) guarantees that resources cannot be duplicated, accidentally destroyed, or left in limbo — if your transaction ends with an undeposited bucket, it fails. This "physical resource" model eliminates entire categories of bugs common in other smart contract platforms. Vault Patterns Single-Vault Component The simplest pattern: a component holds one vault to store a single resource type. A token sale component, for example, stores tokens in a vault and returns XRD payment to the buyer's account. struct TokenSale { token_vault: Vault, // holds tokens for sale xrd_vault: Vault, // collects XRD payments price: Decimal, } Multi-Vault Component (HashMap) When a component needs to manage arbitrary resource types (e.g. a DEX liquidity pool or a wallet-like component), use a HashMap. When a new resource type is deposited, create a new vault; when an existing type arrives, deposit into the matching vault. 
struct MultiVault { vaults: HashMap<ResourceAddress, Vault>, } impl MultiVault { pub fn deposit(&mut self, bucket: Bucket) { let addr = bucket.resource_address(); self.vaults .entry(addr) .or_insert_with(|| Vault::new(addr)) .put(bucket); } } Vault Take & Put Withdraw resources from a vault with .take(amount) (returns a Bucket) or .take_all(). Deposit with .put(bucket). For non-fungibles, use .take_non_fungible(&id) or .take_non_fungibles(&ids) to withdraw specific items. Bucket Patterns Bucket Passing (Move Semantics) Buckets use Rust's ownership model — passing a bucket to a function moves it. The caller no longer has access. This prevents double-spending at the language level: // Caller creates a bucket, passes it to the component let payment: Bucket = account.withdraw(xrd_address, dec!("100")); let tokens: Bucket = sale_component.buy(payment); // 'payment' is now consumed — cannot be used again account.deposit(tokens); Bucket Splitting Use .take(amount) on a bucket to split it into two buckets — the original retains the remainder. This is how you implement partial fills, fee extraction, or distributing resources across multiple destinations within a single transaction. The Worktop In transaction manifests (/contents/tech/core-protocols/transaction-manifests) , returned resources land on the worktop — a temporary staging area. Use TAKE_FROM_WORKTOP to create named buckets from worktop resources, then pass them to subsequent method calls. The transaction fails if any resources remain on the worktop at the end. External Links - Vaults and Buckets — Official Docs (https://docs.radixdlt.com/docs/buckets-and-vaults) - Resources — Official Docs (https://docs.radixdlt.com/docs/resources) - Regulated Token Example — Official Docs (https://docs-babylon.radixdlt.com/main/scrypto/examples/regulated-token.html) ## 5.
Testing Scrypto Blueprints URL: https://radix.wiki/developers/scrypto/05-testing-scrypto Updated: 2026-02-22 Summary: Create, build, and test Scrypto blueprints using the scrypto CLI, resim simulator, and the scrypto-test framework. Building & Testing Scrypto Packages Difficulty Beginner–Intermediate Est. Time 45–60 minutes Prerequisites Dev environment setup (/developers/getting-started/01-install-scrypto) Language Scrypto (/contents/tech/core-protocols/scrypto-programming-language) Key Tools scrypto, resim, scrypto-test Official Docs Scrypto Documentation (https://docs.radixdlt.com/docs/scrypto-1) Introduction Once your development environment (/developers/getting-started/01-install-scrypto) is set up, the next step is understanding the Scrypto development workflow: creating packages, writing blueprints (/contents/tech/core-concepts/blueprints-and-packages) , building to WebAssembly, and testing locally. Radix provides two complementary testing approaches — the resim simulator for interactive exploration and the scrypto-test framework for automated testing. Package Structure A Scrypto package is a standard Rust crate with Scrypto-specific dependencies. When you run scrypto new-package my-dapp, you get: my-dapp/ ├── Cargo.toml # Dependencies (scrypto crate) ├── src/ │ └── lib.rs # Blueprint definitions └── tests/ └── lib.rs # Test suite The src/lib.rs file contains one or more blueprints (/contents/tech/core-concepts/blueprints-and-packages) — reusable templates that define the structure and logic of on-ledger components. Each blueprint is annotated with the #[blueprint] macro and contains a struct (state) and an impl block (functions and methods). Building Compile your package to WebAssembly with: scrypto build This produces a .wasm binary and a .rpd (Radix Package Definition) file in the target/ directory. 
The .rpd contains the package's ABI — the blueprint names, functions, methods, and their type signatures — which the Radix Engine (/contents/tech/core-protocols/radix-engine) uses to validate calls at runtime. Interactive Testing with resim The Radix Engine Simulator (resim) is a local ledger emulator that lets you publish packages, instantiate components, and call methods without connecting to any network. It is invaluable for rapid iteration. Core Commands # Reset simulator state resim reset # Create a new account (returns address, public key, private key, owner badge) resim new-account # Set the active account resim set-default-account <account_address> <private_key> <owner_badge> # Publish a package (returns package address) resim publish . # Call a blueprint function (e.g. instantiate a component) resim call-function <package_address> <blueprint_name> <function_name> [args...] # Call a method on an instantiated component resim call-method <component_address> <method_name> [args...] # Inspect an entity's state resim show <address>
Typical Workflow - resim reset — start with a clean ledger - resim new-account — create a test account - resim publish . — deploy your package - resim call-function — instantiate a component from your blueprint - resim call-method — interact with the component - resim show — inspect state, balances, and vaults (/contents/tech/core-concepts/buckets-proofs-and-vaults) Automated Testing with scrypto-test While resim is great for exploration, production packages need automated tests. Radix provides two testing frameworks: Unit Testing: scrypto-test The scrypto-test (https://docs.radixdlt.com/docs/scrypto-test) framework uses an invocation-based approach — you call blueprint functions and methods directly in Rust, receiving actual Bucket (/contents/tech/core-concepts/buckets-proofs-and-vaults) and Proof objects that you can assert against. At its core is the TestEnvironment struct, which encapsulates a self-contained Radix Engine instance. use scrypto_test::prelude::*; #[test] fn test_instantiation() { let mut env = TestEnvironment::new(); let package = Package::compile_and_publish(".", &mut env).unwrap(); // Call functions, assert on returned buckets/proofs } Key utilities include BucketFactory and ProofFactory for creating test resources, with strategies like CreationStrategy::DisableAuthAndMint for bypassing auth in test contexts. Integration Testing: TestRunner The TestRunner (https://docs.radixdlt.com/docs/learning-to-test-a-multi-blueprint-package) is an in-memory ledger simulator where you interact as an external user submitting transactions. It applies all the same costing limits and auth checks as the real network, making it ideal for end-to-end testing. Run all tests with: scrypto test This wraps cargo test with the correct Scrypto feature flags and environment. 
External Links - Resim — Radix Engine Simulator (https://docs.radixdlt.com/docs/resim-radix-engine-simulator) - scrypto-test Framework — Official Docs (https://docs.radixdlt.com/docs/scrypto-test) - scrypto-test API Reference — docs.rs (https://docs.rs/scrypto-test/latest/scrypto_test/) - Testing Multi-Blueprint Packages — Official Docs (https://docs.radixdlt.com/docs/learning-to-test-a-multi-blueprint-package) ## 4. Events, Metadata, and Royalties URL: https://radix.wiki/developers/scrypto/04-events-metadata-royalties Updated: 2026-02-22 Summary: Emit events, set entity metadata, and configure developer royalties on your blueprints. Events, Metadata, and Royalties Difficulty Intermediate Est. Time 20 minutes Prerequisites Scrypto Fundamentals (/developers/scrypto/01-fundamentals) Language Scrypto (Rust) Official Docs Events (https://docs.radixdlt.com/docs/events) Events Components can emit events (https://docs.radixdlt.com/docs/events) during transaction execution. Events are encoded in SBOR (/contents/tech/core-concepts/sbor) and stored in the transaction receipt — useful for indexing, analytics, and off-chain reactions. #[derive(ScryptoSbor, ScryptoEvent)] pub struct SwapEvent { pub input_amount: Decimal, pub output_amount: Decimal, pub pool_address: ComponentAddress, } // Inside a method: Runtime::emit_event(SwapEvent { input_amount: dec!("100"), output_amount: dec!("95.5"), pool_address: self_address, }); Events appear in Gateway API (/developers/infrastructure/02-radix-apis) responses and can be decoded using the Radix Engine Toolkit or the radix-event-stream library for real-time indexing. Entity Metadata Metadata (/developers/legacy-docs/reference/radix-engine/metadata/metadata) is a key-value store available on all entities — resources, components, packages, and accounts. The Radix Wallet and other clients use standardized keys (https://docs.radixdlt.com/docs/metadata-standards) to display assets consistently.
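Because clients key their display logic off these standardized names, dApp frontends often sanity-check metadata before submitting it. A sketch of such a check, assuming the standard key set and the 100-character key limit described in the Radix metadata standards (the function itself is illustrative, not a library API):

```typescript
// Standard metadata keys recognized by the Radix Wallet and Dashboard.
const STANDARD_KEYS = new Set(["name", "symbol", "description", "icon_url", "info_url"]);

// Classify a metadata key: standard keys get first-class display, custom
// keys are allowed but shown as-is, over-long keys are rejected.
function classifyMetadataKey(key: string): "standard" | "custom" | "invalid" {
  if (key.length === 0 || key.length > 100) return "invalid"; // 100-char key limit
  return STANDARD_KEYS.has(key) ? "standard" : "custom";
}
```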
Setting Metadata at Creation ResourceBuilder::new_fungible(OwnerRole::None) .metadata(metadata!( init { "name" => "My Token", locked; // Cannot be changed "symbol" => "MYT", locked; "description" => "A token", updatable; // Can be updated "icon_url" => Url::of("https://example.com/icon.png"), locked; } )) .mint_initial_supply(1000); Standard Metadata Keys Key Used By Purpose name Wallet, Dashboard Human-readable name symbol Wallet Short ticker (e.g., XRD) description Dashboard Brief description icon_url Wallet Token icon image info_url Dashboard Project website Keys are strings (max 100 chars). You can add custom keys beyond the standard set — clients will display them as-is. Royalties Radix has a built-in royalty system (https://docs.radixdlt.com/docs/royalties) that lets developers earn XRD whenever their blueprints or components are used. Royalties are collected automatically as part of transaction fees — users see the total cost upfront. Package Royalties Set royalties on specific functions or methods when publishing a package: #[blueprint] #[types(/* ... */)] mod my_dex { enable_package_royalties! { instantiate => Free; swap => Xrd(1.into()); // 1 XRD per swap call } // ... } Component Royalties Component instantiators can also set per-method royalties at instantiation time. The total royalty for a transaction is the sum of all package and component royalties for every method called. Pricing strategy You can specify royalties as a fixed XRD amount or an approximate USD equivalent. Use USD-equivalent pricing for stable fee expectations regardless of XRD price fluctuations. External Links - Events documentation (https://docs.radixdlt.com/docs/events) - Metadata documentation (https://docs.radixdlt.com/docs/metadata) - Metadata Standards (/developers/legacy-docs/reference/standards/metadata-standards/metadata-standards) - Royalties documentation (https://docs.radixdlt.com/docs/royalties) ## 3.
Authorization and Access Rules URL: https://radix.wiki/developers/scrypto/03-authorization-and-badges Updated: 2026-02-22 Summary: Use badge resources as unforgeable credentials to gate method access on Scrypto components. Introduction In most smart contract platforms, access control is based on the caller's address — a pattern that leads to fragile permission systems and common exploits like reentrancy. Radix takes a fundamentally different approach: access is gated by badges, which are standard resources (/contents/tech/core-concepts/asset-oriented-programming) (fungible or non-fungible) that serve as unforgeable credentials. A caller is authorised not because of who they are but because of what they hold. This pattern is central to Scrypto development and appears in virtually every non-trivial dApp on Radix. How It Works Access Rules When a component is instantiated, its methods can be protected with access rules (/contents/tech/core-concepts/access-rules-and-auth-zones) that specify which badge(s) must be present for a call to succeed. The Radix Engine (/contents/tech/core-protocols/radix-engine) checks these rules automatically before executing any method — there is no manual require(msg.sender == owner) logic. enable_method_auth! { roles { admin => updatable_by: []; minter => updatable_by: [admin]; }, methods { mint_tokens => restrict_to: [minter, admin]; update_price => restrict_to: [admin]; buy => PUBLIC; } } Proofs and the Auth Zone When a method requires a badge, the caller provides a Proof (/contents/tech/core-concepts/buckets-proofs-and-vaults) — a cryptographic attestation that a resource exists in the caller's possession without transferring it. Proofs can be placed on the Auth Zone (a transaction-scoped container) so that multiple method calls within the same transaction can share the same authorisation context. 
# Transaction manifest: create proof and call restricted method CREATE_PROOF_FROM_ACCOUNT_OF_AMOUNT Address("account_rdx...") Address("resource_rdx...admin_badge...") Decimal("1") ; CALL_METHOD Address("component_rdx...") "update_price" Decimal("1.50") ; Common Badge Patterns Admin Badge The most basic pattern: mint a single non-fungible badge at instantiation and return it to the deployer. Methods like withdraw_fees, update_config, or pause are gated behind this badge. User Badge The User Badge Pattern (/developers/legacy-docs/build/scrypto-1/scrypto-design-patterns/user-badge-pattern) issues a non-fungible badge to each user when they register. The badge's non-fungible data stores user-specific state (balances, permissions, membership tier). Methods read the caller's badge data to personalise behaviour without maintaining a separate user registry. Multi-Signature Access rules support boolean logic: require_n_of(2, [badge_a, badge_b, badge_c]) creates a 2-of-3 multi-sig gate. This is useful for treasury management, protocol upgrades, or any high-stakes operation. Soulbound Badges By creating a non-transferable resource (restrict Deposit and Withdraw actions), a badge becomes soulbound to the original recipient's account. This is ideal for identity credentials, certificates, or membership tokens that should not change hands. 
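The require_n_of rule above amounts to a threshold check over the badges present in the caller's Auth Zone. A conceptual TypeScript model of that semantics (this only mirrors the logic; the engine evaluates access rules natively on-ledger, and badge identifiers here are plain strings for illustration):

```typescript
// A call passes when proofs for at least n of the listed badge resources
// are present in the Auth Zone — e.g. require_n_of(2, [a, b, c]) is a
// 2-of-3 multi-sig gate. Conceptual model only, not the engine's code.
function requireNOf(n: number, badges: string[], authZone: Set<string>): boolean {
  const present = badges.filter((badge) => authZone.has(badge)).length;
  return present >= n;
}
```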
External Links - Authorization — Proofs — Official Docs (https://docs.radixdlt.com/docs/auth) - User Badge Pattern — Official Docs (https://docs.radixdlt.com/docs/user-badge-pattern) - Account Deposit Patterns — Official Docs (https://docs.radixdlt.com/docs/account-deposit-patterns) Badge-Gated Authorization Pattern Type Access Control Difficulty Beginner–Intermediate Concepts Access Rules (/contents/tech/core-concepts/access-rules-and-auth-zones) , Proofs (/contents/tech/core-concepts/buckets-proofs-and-vaults) , Badges Official Docs Authorization — Proofs (https://docs.radixdlt.com/docs/auth) Example User Badge Pattern (/developers/legacy-docs/build/scrypto-1/scrypto-design-patterns/user-badge-pattern) ## 2. Resources, Vaults, and NFTs URL: https://radix.wiki/developers/scrypto/02-resources-and-nfts Updated: 2026-02-22 Summary: Create fungible and non-fungible resources, manage vaults and buckets, and use proofs. Resources, Vaults, and NFTs Difficulty Intermediate Est. Time 25 minutes Prerequisites Scrypto Fundamentals (/developers/scrypto/01-fundamentals) Language Scrypto (Rust) Official Docs Resources (/developers/legacy-docs/build/scrypto-1/resources/resources) Resources A resource (https://docs.radixdlt.com/docs/resources) is any digital asset on Radix — tokens, NFTs, badges, LP tokens, etc. Unlike Solidity where tokens are contract state, Radix resources are native engine primitives with built-in rules for minting, burning, transferring, and access control. There are two kinds: - Fungible — identical, interchangeable units (e.g., XRD, stablecoins) - Non-fungible — unique items with individual data (e.g., NFTs, access badges) Creating Fungible Resources let my_token = ResourceBuilder::new_fungible(OwnerRole::None) .metadata(metadata!( init { "name" => "My Token", locked; "symbol" => "MYT", locked; } )) .mint_initial_supply(1_000_000) .into(); This creates 1 million tokens with locked name and symbol.
The OwnerRole::None means no one can change the resource's configuration after creation. Creating Non-Fungible Resources (NFTs) #[derive(ScryptoSbor, NonFungibleData)] pub struct Ticket { pub event_name: String, pub seat: u32, #[mutable] pub used: bool, } let tickets = ResourceBuilder::new_integer_non_fungible::<Ticket>(OwnerRole::None) .metadata(metadata!( init { "name" => "Event Ticket", locked; } )) .mint_initial_supply([ (IntegerNonFungibleLocalId::new(1), Ticket { event_name: "RadFest".into(), seat: 42, used: false, }), ]) .into(); Fields marked #[mutable] can be updated later via the ResourceManager. Immutable fields are fixed at mint time. Vaults and Buckets Resources must always be inside a container: Container Lifetime Purpose Vault Permanent (on-ledger state) Store resources between transactions Bucket Transient (single transaction) Move resources during a transaction The Radix Engine enforces that all buckets must be empty by the end of every transaction. This guarantees no resources are accidentally lost or left in limbo. // Take 10 tokens from a vault into a bucket let bucket: Bucket = self.my_vault.take(10); // Put them into another vault self.other_vault.put(bucket); Typed variants (FungibleVault, NonFungibleVault, FungibleBucket, NonFungibleBucket) provide type-safe operations when you know the resource type at compile time. Proofs Proofs (https://docs.radixdlt.com/docs/auth) let you demonstrate ownership of a resource without transferring it — like showing an ID badge without handing it over. // Create a proof from a vault let proof = self.admin_badge.create_proof_of_all(); // The proof is automatically placed in the Auth Zone // where the Radix Engine checks it against access rules Proofs are central to Radix's authorization model (/developers/scrypto/03-authorization-and-badges) . Protected methods check for proofs in the caller's Auth Zone rather than checking msg.sender.
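The "all buckets must be empty" rule can be modeled as simple bookkeeping over open containers. A toy TypeScript sketch of the end-of-transaction check (the class and its mechanics are illustrative, not the engine's actual implementation):

```typescript
// Toy model: a transaction tracks every bucket it opens; commit fails if
// any bucket still holds resources, mirroring the engine's invariant.
class TxModel {
  private openBuckets = new Map<string, number>();

  // Take an amount into a named bucket (opens the bucket if needed).
  takeIntoBucket(name: string, amount: number): void {
    this.openBuckets.set(name, (this.openBuckets.get(name) ?? 0) + amount);
  }

  // Depositing consumes the bucket and returns what it held.
  depositBucket(name: string): number {
    const amount = this.openBuckets.get(name) ?? 0;
    this.openBuckets.delete(name);
    return amount;
  }

  // End-of-transaction check: any non-empty bucket aborts the transaction.
  commit(): void {
    for (const [name, amount] of this.openBuckets) {
      if (amount > 0) throw new Error(`dangling bucket "${name}" holds ${amount}`);
    }
  }
}
```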
ResourceManager Every resource has a ResourceManager — the on-ledger controller for that resource type. Use it to mint, burn, query supply, and update NFT data: // Mint more tokens (if resource was created with mint role) let new_tokens: Bucket = self.resource_manager.mint(500); // Get total supply let supply: Decimal = self.resource_manager.total_supply().unwrap(); // Update mutable NFT data self.resource_manager.update_non_fungible_data( &nft_id, "used", true, ); External Links - Resources documentation (https://docs.radixdlt.com/docs/resources) - Non-Fungible Tokens (https://docs.radixdlt.com/docs/non-fungible-tokens) - Proofs and Authorization (https://docs.radixdlt.com/docs/auth) - What are Vaults and Buckets? (https://learn.radixdlt.com/article/what-are-vaults-and-buckets) ## 1. Scrypto Fundamentals URL: https://radix.wiki/developers/scrypto/01-fundamentals Updated: 2026-02-22 Summary: Understand asset-oriented programming, blueprints, components, and packages on Radix. Scrypto Fundamentals Difficulty Beginner Est. Time 20 minutes Prerequisites Your First Blueprint (/developers/getting-started/02-first-blueprint) Language Scrypto (Rust) Official Docs Scrypto Overview (https://docs.radixdlt.com/docs/scrypto-overview) Asset-Oriented Programming Scrypto is built on asset-oriented programming (/contents/tech/core-concepts/asset-oriented-programming) — a paradigm where digital assets are first-class primitives managed by the Radix Engine (/contents/tech/core-concepts/radix-engine) , not arbitrary integers in a smart contract's storage. In Solidity, a token is a mapping of addresses to balances inside a contract. Transferring tokens means calling a function that modifies that mapping. Bugs in this logic cause real losses — reentrancy, integer overflow, and unauthorized access. In Scrypto, tokens are resources — objects with physical-like properties enforced by the engine. They cannot be duplicated, destroyed without authorization, or exist outside a container. 
The runtime guarantees that every resource is accounted for at the end of every transaction. Blueprints, Components, and Packages Scrypto's object model has three levels: Concept Analogy Description Package Library / crate Deployment unit containing one or more blueprints. Deployed once, referenced by address. Blueprint Class / template Defines state shape and methods. Contains no state itself — it's a template. Component Instance / object A live instantiation of a blueprint. Holds state, owns resources in vaults. A single blueprint can be instantiated many times — each component is independent with its own state and resource holdings. Code Structure use scrypto::prelude::*; #[blueprint] mod my_blueprint { struct MyBlueprint { // State fields — persisted between transactions my_vault: Vault, count: u64, } impl MyBlueprint { // Functions: called on the blueprint (no &self) // Used for instantiation pub fn instantiate() -> Global<MyBlueprint> { Self { /* ... */ } .instantiate() .prepare_to_globalize(OwnerRole::None) .globalize() } // Methods: called on a component (&self or &mut self) pub fn get_count(&self) -> u64 { self.count } pub fn increment(&mut self) { self.count += 1; } } } The #[blueprint] macro handles serialization, state management, and ABI generation. You write plain Rust structs and methods.
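The blueprint/component split maps directly onto the class/instance distinction in the analogy table above. A plain TypeScript analogy (not Scrypto) of one "blueprint" producing independent "components":

```typescript
// The class plays the role of a blueprint — a stateless template — and
// each instance plays the role of a component with its own state.
class CounterBlueprint {
  private count = 0; // per-"component" state, like a struct field

  increment(): void {
    this.count += 1;
  }

  getCount(): number {
    return this.count;
  }
}

// "Instantiate" two components from the same blueprint; their state is
// fully independent, just as with on-ledger components.
const a = new CounterBlueprint();
const b = new CounterBlueprint();
a.increment();
```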
How Radix Engine Differs from EVM

| Aspect | EVM (Solidity) | Radix Engine (Scrypto) |
|---|---|---|
| Assets | Contract storage mappings | First-class resources with engine-enforced rules |
| Transfer | Call a function that mutates state | Move a bucket between vaults |
| Authorization | msg.sender checks | Badge-based access rules (/developers/scrypto/03-authorization-and-badges) |
| Reentrancy | Must be guarded manually | Impossible — resources move, not references |
| Composability | External calls with ABI encoding | Native cross-component calls via transaction manifests (/developers/transactions/01-manifest-language) |
| Scaling | Single global state | Shard-aware via Cerberus (/contents/tech/core-concepts/cerberus-consensus) |

Key Takeaways - Resources are real — they behave like physical objects, not database entries - Blueprints are templates — they define behavior but hold no state - Components are instances — each with independent state and resource vaults - The engine enforces safety — reentrancy, double-spend, and overflow bugs are prevented at the runtime level External Links - Scrypto Overview (https://docs.radixdlt.com/docs/scrypto-overview) - Asset-Oriented Programming (/contents/tech/core-concepts/asset-oriented-programming) - What is Radix Engine? (https://learn.radixdlt.com/article/what-is-radix-engine) - Components, Blueprints, and the Blueprint Catalog (https://learn.radixdlt.com/article/what-are-components-blueprints-and-the-blueprint-catalog) ## 3. Deploying to Stokenet and Mainnet URL: https://radix.wiki/developers/getting-started/03-deploying Updated: 2026-02-22 Summary: Deploy a Scrypto package to the Stokenet testnet or Mainnet via the Developer Console. Deploying to Stokenet & Mainnet Difficulty Beginner Est.
Time 20 minutes Prerequisites Your First Blueprint (/developers/getting-started/02-first-blueprint) Language Scrypto (Rust) Stokenet Console stokenet-console.radixdlt.com (https://stokenet-console.radixdlt.com/deploy-package) Mainnet Console console.radixdlt.com (https://console.radixdlt.com/deploy-package) Overview Once you've tested locally with resim, the next step is deploying to the public Stokenet testnet. The process is the same for Mainnet — the only difference is you use real XRD. Stokenet vs Mainnet Always deploy and test on Stokenet (/contents/tech/releases/stokenet) first. Stokenet XRD is free and has no value. Never send Mainnet XRD to Stokenet addresses. 1. Get Test XRD Open the Radix Wallet (/contents/tech/core-protocols/radix-wallet) and switch to the Stokenet network. Use the built-in faucet to request free test XRD — you'll need it to pay transaction fees for deployment. 2. Build Your Package

scrypto build

Your compiled WASM file is at target/wasm32-unknown-unknown/release/<package_name>.wasm. The build also generates an RPD (Radix Package Definition) file at the same location, which describes your blueprints and their interfaces. 3. Deploy via Developer Console - Go to stokenet-console.radixdlt.com/deploy-package (https://stokenet-console.radixdlt.com/deploy-package) - Click "Connect" in the top-right and connect your Radix Wallet - Upload your .wasm and .rpd files - Review the auto-generated transaction manifest (/developers/transactions/01-manifest-language) - Click "Send to Wallet" and approve the transaction in your wallet Once confirmed, the console displays your package address. Save it — you'll need it to instantiate components. 4.
Instantiate a Component Go to the "Send Raw Transaction" section of the console and submit a manifest that calls your instantiation function (replace <package_address> and <account_address> with your own):

CALL_FUNCTION
    Address("<package_address>")
    "GumballMachine"
    "instantiate"
    Decimal("5")
;
CALL_METHOD
    Address("<account_address>")
    "deposit_batch"
    Expression("ENTIRE_WORKTOP")
;

The resulting component address is your live dApp on Stokenet. Deploying to Mainnet The process is identical — use console.radixdlt.com (https://console.radixdlt.com/deploy-package) instead and ensure your wallet is on Mainnet with real XRD for fees. Royalties Before deploying to Mainnet, consider setting package royalties (/developers/scrypto/04-events-metadata-royalties) to earn XRD whenever others use your blueprints. Next Steps - Scrypto Fundamentals (/developers/scrypto/01-fundamentals) — understand the asset-oriented programming model - Radix dApp Toolkit (/developers/frontend/01-radix-dapp-toolkit) — build a frontend that connects to your component - Transaction Manifest Language (/developers/transactions/01-manifest-language) — learn to write manifests directly External Links - Official deployment guide (https://docs.radixdlt.com/docs/deploy-a-package) - Stokenet Developer Console (https://stokenet-console.radixdlt.com) - Mainnet Developer Console (https://console.radixdlt.com) - What is Stokenet? (https://learn.radixdlt.com/article/what-is-stokenet) ## 2. Your First Blueprint URL: https://radix.wiki/developers/getting-started/02-first-blueprint Updated: 2026-02-22 Summary: Create a Gumball Machine blueprint, test it locally with resim, and learn Scrypto basics. Your First Blueprint Difficulty Beginner Est. Time 30 minutes Prerequisites Installing Scrypto (/developers/getting-started/01-install-scrypto) Language Scrypto (Rust) Official Docs Build a Gumball Machine (/developers/legacy-docs/build/learning-step-by-step/learning-to-build-a-gumball-machine) Overview In this tutorial you'll build a Gumball Machine — a simple component that sells gumball tokens for XRD.
Along the way you'll learn the core Scrypto concepts: blueprints, components, resources, and vaults (/developers/scrypto/01-fundamentals). Create a Package A package is the deployment unit in Scrypto — it contains one or more blueprints.

scrypto new-package gumball_machine
cd gumball_machine

This creates:

gumball_machine/
├── Cargo.toml      # Dependencies
├── src/
│   └── lib.rs      # Blueprint code
└── tests/
    └── lib.rs      # Tests

Write the Blueprint Replace src/lib.rs with the following. Each section is explained below.

use scrypto::prelude::*;

#[derive(ScryptoSbor)]
pub struct Status {
    pub price: Decimal,
    pub amount: Decimal,
}

#[blueprint]
mod gumball_machine {
    struct GumballMachine {
        gum_resource_manager: ResourceManager,
        gumballs: Vault,
        collected_xrd: Vault,
        price: Decimal,
    }

    impl GumballMachine {
        pub fn instantiate(price: Decimal) -> Global<GumballMachine> {
            let gumballs = ResourceBuilder::new_fungible(OwnerRole::None)
                .metadata(metadata!(
                    init {
                        "name" => "Gumball", locked;
                        "symbol" => "GUM", locked;
                    }
                ))
                .mint_initial_supply(100)
                .into();

            Self {
                gum_resource_manager: gumballs.resource_manager(),
                gumballs: Vault::with_bucket(gumballs),
                collected_xrd: Vault::new(XRD),
                price,
            }
            .instantiate()
            .prepare_to_globalize(OwnerRole::None)
            .globalize()
        }

        pub fn buy_gumball(&mut self, mut payment: Bucket) -> (Bucket, Bucket) {
            let cost = self.price.clone();
            let our_share = payment.take(cost);
            self.collected_xrd.put(our_share);
            (self.gumballs.take(1), payment)
        }

        pub fn get_status(&self) -> Status {
            Status {
                price: self.price.clone(),
                amount: self.gumballs.amount(),
            }
        }
    }
}

Key Concepts - Blueprint — a template (like a class). The #[blueprint] macro exposes it to the Radix Engine. - Component — a live instance of a blueprint, created by instantiate(). - Resource — a first-class digital asset. ResourceBuilder creates a new fungible token ("Gumball"). - Vault — permanent on-ledger storage for resources. Components hold assets in vaults.
- Bucket — a transient container for moving resources between vaults during a transaction. Notice there is no balance tracking or transfer logic — the Radix Engine handles asset management natively (/developers/scrypto/01-fundamentals). Build

scrypto build

This compiles to WebAssembly at target/wasm32-unknown-unknown/release/gumball_machine.wasm. Test with resim resim is the Radix Engine Simulator (https://docs.radixdlt.com/docs/resim-radix-engine-simulator) — a local ledger you can use without connecting to any network.

# Reset the simulator
resim reset

# Create a test account (saves address to $account)
resim new-account

# Publish the package
resim publish .
# Note the package address in the output

# Instantiate a GumballMachine with price = 5 XRD
resim call-function <package_address> GumballMachine instantiate 5
# Note the component address in the output

# Buy a gumball by sending 10 XRD
resim call-method <component_address> buy_gumball "10,resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3"

# Check your account
resim show

You should see GUM tokens in your account and 5 XRD change returned. Handy resim commands resim show
— inspect any entity. resim new-token-fixed 1000 — create test tokens. resim reset — wipe and start fresh. Next Steps - Deploy to Stokenet (/developers/getting-started/03-deploying) — publish your package on the public testnet - Scrypto Fundamentals (/developers/scrypto/01-fundamentals) — deeper dive into the asset-oriented model - Testing Scrypto Blueprints (/developers/scrypto/05-testing-scrypto) — write proper unit and integration tests External Links - Official Gumball Machine tutorial (https://docs.radixdlt.com/docs/learning-to-build-a-gumball-machine) - resim documentation (https://docs.radixdlt.com/docs/resim-radix-engine-simulator) - Official Scrypto examples (https://github.com/radixdlt/official-examples) ## 1. Installing Scrypto URL: https://radix.wiki/developers/getting-started/01-install-scrypto Updated: 2026-02-22 Summary: Install Rust and the Scrypto CLI toolchain on macOS, Linux, or Windows. Installing Scrypto Difficulty Beginner Est. Time 15 minutes Prerequisites None Language Rust Official Docs Getting Rust & Scrypto (https://docs.radixdlt.com/docs/getting-rust-scrypto) Overview Scrypto (/developers/legacy-docs/updates/release-notes/scrypto/scrypto) is Radix's smart-contract language, built on Rust. You need three things: the Rust toolchain, the WebAssembly compilation target, and the Radix CLI tools. 1. Install Rust Install Rust via rustup (https://rustup.rs). Scrypto 1.3.1 supports Rust 1.81.0 and newer (tested up to 1.92.0).

[macOS / Linux]
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
source "$HOME/.cargo/env"

[Windows]
# Download and run the installer from https://rustup.rs
# Choose "Proceed with standard installation"

2. Add the WASM Target Scrypto compiles blueprints to WebAssembly (https://webassembly.org/) for deterministic execution on the Radix Engine.

rustup target add wasm32-unknown-unknown

3.
Install Radix CLI Tools The radix-clis crate provides scrypto (compiler), resim (local simulator), and rtmc (manifest compiler).

cargo install --force radix-clis

Build time: The first install compiles from source and may take several minutes. Subsequent installs are faster. 4. Verify

rustc --version
scrypto --version
resim --version

All three commands should print version numbers without errors. IDE Setup For the best experience, install VS Code (https://code.visualstudio.com/) with the rust-analyzer (https://marketplace.visualstudio.com/items?itemName=rust-lang.rust-analyzer) extension. This gives you inline type hints, autocompletion, and error checking for Scrypto code. Troubleshooting Missing WASM target If you see wasm32-unknown-unknown target not found, re-run:

rustup target add wasm32-unknown-unknown

Outdated toolchain If builds fail on a fresh install, update Rust:

rustup update

External Links - Official install guide (https://docs.radixdlt.com/docs/getting-rust-scrypto) - Scrypto GitHub repository (https://github.com/radixdlt/radixdlt-scrypto) - Scrypto 1.3.1 release blog (https://www.radixdlt.com/blog/scrypto-1-3-1-unlocking-modern-rust-support) ## hyperscale-rs URL: https://radix.wiki/contents/tech/research/hyperscale-rs Updated: 2026-02-19 Summary: Community-built Rust implementation of the Hyperscale consensus protocol, targeting a viable Xi'an network for Radix DLT.
hyperscale-rs Type Consensus Protocol Implementation Language Rust (94.4%) Status Active Development Commits 520+ Lead flightofthefox (https://github.com/flightofthefox) ( proven.network (https://proven.network) ) GitHub hyperscalers/hyperscale-rs (https://github.com/hyperscalers/hyperscale-rs) Telegram t.me/hyperscale_rs (https://t.me/hyperscale_rs) (319 members) Stars / Forks 16 / 10 Contributors 4 Platforms Linux (x86_64), macOS (ARM64) Introduction hyperscale-rs is a community-built Rust implementation (https://github.com/hyperscalers/hyperscale-rs) of the Hyperscale consensus protocol for the Radix DLT ecosystem. The project's stated goal is to produce a viable Xi'an candidate (https://www.radixdlt.com/blog/radix-labs-roadmap---to-hyperscale-and-beyond) — the next-generation sharded consensus layer that will enable Radix to become what its community describes as "the world's first linearly scalable Layer 1 network." Led by flightofthefox of proven.network (https://proven.network) , the project was publicly announced (https://t.me/hyperscale_rs) with the opening of its source code and an invitation for community review and contribution. It is community-funded through a Radix donation address and represents an independent effort to improve upon the official reference implementation of Hyperscale. Background: Hyperscale & Xi'an Radix's long-term roadmap centres on achieving linear scalability — the ability to increase throughput proportionally by adding more shards to the network. The Hyperscale Alpha consensus mechanism (https://radixecosystem.com/news/hyperscale-alpha-part-i-the-inception-of-a-hybrid-consensus-mechanism-the-radix-blog-radix-dlt) (formerly known as Cassandra) represents Radix's approach to this problem, combining principles from Nakamoto consensus and classical Byzantine Fault Tolerant (BFT) protocols. 
In public testing, the Hyperscale protocol sustained over 500,000 transactions per second (https://getradix.com/updates/news/hyperscale-update-500k-public-test-done-the-radix-blog-radix-dlt) with peaks exceeding 700,000 TPS across more than 590 participating nodes. Private testing demonstrated linear scaling at roughly 250,000 TPS on 64 shards and maintained the same per-shard throughput at 500,000 TPS on 128 shards. The Xi'an production track (https://www.radixdlt.com/blog/radix-labs-roadmap---to-hyperscale-and-beyond) will build this hybrid consensus mechanism into a production network candidate, with an Alpha release targeted for early 2027. A significant portion of the Xi'an work involves reimplementing the stack entirely in Rust, moving away from Java and the hybrid nature of the current Babylon network. Architecture hyperscale-rs is architected as a pure consensus layer (https://github.com/hyperscalers/hyperscale-rs) — deliberately containing no I/O, no locks, and no async code. This makes deterministic simulation testing a core design principle: the entire consensus logic can be tested without non-determinism from network or disk operations. Consensus Mechanism The protocol uses a faster two-chain commit derived from HotStuff-2. The lead developer evaluated HotStuff-1 during the BFT module implementation but concluded that the theoretical latency improvement was not worth the added complexity — describing HotStuff-2 as being in the "Goldilocks zone."
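The "pure consensus layer" style described above can be sketched in a few lines of Rust (illustrative only — the names and event shapes here are invented, not hyperscale-rs APIs): the core is a deterministic state machine that consumes input events and returns output actions, while a simulator or production runner supplies networking, storage, and time.

```rust
// Sans-I/O sketch: no sockets, no clocks, no locks inside the core,
// so the same event sequence always produces the same outputs.
#[derive(Debug, PartialEq)]
enum Input {
    VoteReceived { round: u64 },
}

#[derive(Debug, PartialEq)]
enum Output {
    BroadcastQc { round: u64 }, // tell the runner to gossip a QC
    None,                       // nothing to do yet
}

struct Core {
    votes: u64,
    quorum: u64,
}

impl Core {
    // Pure transition function: the only way state changes is via events.
    fn on_event(&mut self, input: Input) -> Output {
        match input {
            Input::VoteReceived { round } => {
                self.votes += 1;
                if self.votes == self.quorum {
                    Output::BroadcastQc { round }
                } else {
                    Output::None
                }
            }
        }
    }
}

fn main() {
    let mut core = Core { votes: 0, quorum: 3 };
    core.on_event(Input::VoteReceived { round: 1 });
    core.on_event(Input::VoteReceived { round: 1 });
    let out = core.on_event(Input::VoteReceived { round: 1 });
    println!("{:?}", out); // the third vote reaches quorum
}
```

Because the core never touches I/O, a deterministic simulator can drive thousands of interleavings of such events and reproduce any failing schedule exactly.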
Key consensus features include: - Optimistic pipelining — proposers can submit new blocks immediately after quorum certificate (QC) formation, without waiting for the previous block to be fully committed - One-round finality — BFT provides finality with no possibility of reorganisation after QC - Cross-shard livelock prevention — enhanced mechanisms for detecting and resolving deadlocks across shards - Two-phase commit (https://pprogrammingg.github.io/web3_modules/hyperscale-rs/module-01b-tx-flow.html) for cross-shard atomicity, where a coordinator sends prepare messages, shards lock resources, then commit or abort Fault Model The system uses n = 3f+1 validators per shard, tolerating up to f Byzantine nodes, with a quorum requirement of 2f+1 (https://pprogrammingg.github.io/web3_modules/hyperscale-rs/module-01b-tx-flow.html) votes for QC formation. Radix Engine Integration Unlike the reference implementation, hyperscale-rs integrates with the real Radix Engine for smart contract execution, providing actual transaction processing rather than simulated execution. 
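The n = 3f + 1 fault model above fixes the quorum arithmetic. A quick sketch of the relationship (helper names are mine, not hyperscale-rs code):

```rust
// Given n = 3f + 1 validators per shard, the largest tolerable number of
// Byzantine nodes is f = (n - 1) / 3, and a quorum certificate requires
// 2f + 1 matching votes.
fn byzantine_tolerance(n: u64) -> u64 {
    (n - 1) / 3
}

fn quorum_size(n: u64) -> u64 {
    2 * byzantine_tolerance(n) + 1
}

fn main() {
    // e.g. a 4-validator shard tolerates 1 fault and needs 3 votes for a QC
    println!("f = {}, quorum = {}", byzantine_tolerance(4), quorum_size(4));
}
```

The 2f + 1 threshold guarantees that any two quorums overlap in at least f + 1 validators, of which at least one is honest — which is what rules out two conflicting QCs in the same round.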
Crate Structure The project is organised as a modular system of 15+ Rust crates (https://github.com/hyperscalers/hyperscale-rs) , each handling a specific responsibility: Core - hyperscale-types — fundamental data structures: cryptographic hashes, blocks, votes, quorum certificates - hyperscale-core — trait-based architecture foundation and state machines - hyperscale-bft — Byzantine fault-tolerant consensus mechanics: block proposal, voting, view changes Execution - hyperscale-execution — transaction processing with two-phase commit coordination - hyperscale-mempool — transaction pool administration and shard-specific queuing - hyperscale-livelock — cross-shard deadlock detection and resolution Infrastructure - hyperscale-node — integrates all subsystems into a complete node - hyperscale-production — networking via libp2p and persistence via RocksDB Testing & Tooling - hyperscale-simulator — deterministic simulation testing framework with configurable network conditions - hyperscale-spammer — load generation utility for performance evaluation and benchmarking Transaction Flow The transaction lifecycle (https://pprogrammingg.github.io/web3_modules/hyperscale-rs/module-01b-tx-flow.html) follows a 14-step pipeline from user submission to finality, spanning three phases: Pre-Consensus (Steps 1–6) A user signs a transaction externally, which is submitted via an RPC gateway. The node receives the raw bytes, converts them to internal events, and performs cross-shard analysis to determine which NodeIDs (components, resources, packages, accounts) are touched. Transactions enter shard-specific mempools; cross-shard transactions are propagated to all involved shards via libp2p Gossipsub. BFT Consensus (Steps 7–11) Proposer selection occurs deterministically per round (e.g., round-robin by validator identity). The proposer builds a block from mempool transactions. Validators authenticate the block and broadcast votes. 
A quorum certificate is formed when 2f+1 votes are collected — notably, the QC is not sent as a separate message but is formed by the next proposer from collected votes. The block is committed once the commit rule is satisfied. Execution & Finality (Steps 12–14) Committed transactions are executed per shard using the Radix Engine. Cross-shard coordination uses a two-phase commit protocol where the coordinator sends prepare messages, shards lock resources without visible state changes, then commit or abort with state applied in an agreed order. BFT provides one-round finality with no possibility of reorganisation. Improvements Over Reference Implementation According to the project's pinned announcement (https://t.me/hyperscale_rs) , hyperscale-rs aims to improve upon the official Hyperscale reference implementation in several areas: - Better architected — modular crate structure replacing monolithic code - Better documented — improved code documentation and community-contributed learning materials - Better tested — deterministic simulation testing as a core design principle - More modular and maintainable — ongoing refactoring work to break down what was described as "kitchen drawer" crates into more focused, understandable modules - More closely aligned with production values — designed with real-world deployment in mind - More observable — enhanced monitoring and debugging capabilities - Significant upgrades to the consensus process — including the HotStuff-2 based approach - Safety violation fixes — addresses serious safety violations identified in the reference implementation - Real Radix Engine — uses the actual Radix Engine for smart contract execution rather than an alternative execution environment The lead developer notes that the implementation may currently be slower in practice due to the overhead of fixing safety bugs and using the more resource-intensive real Radix Engine, but states that an apples-to-apples comparison would show faster performance. 
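The cross-shard two-phase commit described in the transaction flow above can be condensed into a toy coordinator (a sketch with invented types, not actual hyperscale-rs code): every involved shard is asked to prepare by locking resources, and only if all succeed does the coordinator commit.

```rust
// Minimal 2PC sketch: prepare on all shards, then commit or abort.
#[derive(Clone, Copy, PartialEq, Debug)]
enum Decision {
    Commit,
    Abort,
}

struct Shard {
    can_lock: bool, // whether this shard can lock the touched resources
}

impl Shard {
    // Prepare phase: lock resources without any visible state change.
    fn prepare(&self) -> bool {
        self.can_lock
    }
}

fn coordinate(shards: &[Shard]) -> Decision {
    if shards.iter().all(|s| s.prepare()) {
        Decision::Commit // state then applied in an agreed order
    } else {
        Decision::Abort // locks released, no state change anywhere
    }
}

fn main() {
    let shards = [Shard { can_lock: true }, Shard { can_lock: true }];
    println!("{:?}", coordinate(&shards));
}
```

A single shard failing to lock is enough to abort the whole transaction, which is what gives cross-shard transactions their atomicity.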
Development & Community hyperscale-rs is developed openly with an active Telegram community (https://t.me/hyperscale_rs) of 319 members. The project's development is tracked through automated GitHub commit notifications in the channel via a Kit-Watcher bot. Key Contributors - flightofthefox ( proven.network (https://proven.network) ) — lead developer and channel admin - wizzl0r — channel owner, involved in PR reviews and testing infrastructure - Radical — code reviewer, working through the codebase and contributing documentation (https://pprogrammingg.github.io/web3_modules/hyperscale-rs/module-01b-tx-flow.html) on transaction flow Recent Development (February 2026) As of mid-February 2026, flightofthefox has resumed active development with a series of significant architectural commits: - Networking code extraction — extracting networking code from runners into dedicated crates (43 files, +2703 −2440), with a second clean-up pass planned after common runner logic is extracted - Simulation refactoring — removing parallel simulation in favour of a configurable dispatch mechanism in the main simulation crate (14 files, +1494 −1482) - Rayon dependency removal — ensuring par_iter is handled via dispatch trait only, removing the direct dependency on the rayon crate from the production build (8 files, +217 −324) The project continues to refactor runner crates to be more modular and accessible to newcomers, and has identified a need for contributors with experience in Terraform, Infrastructure-as-Code (IaC), and cloud infrastructure to help with end-to-end testing. Funding The project is community-funded through donations, including the FoxFund — a community funding initiative where supporters donate to sustain ongoing development. Community members have noted that donations directly correlate with development activity, with Peachy (RAC member) publicly donating to see continued commits. 
External Links - GitHub Repository — hyperscalers/hyperscale-rs (https://github.com/hyperscalers/hyperscale-rs) - Telegram Community — hyperscale-rs for Radix (https://t.me/hyperscale_rs) - Transaction Flow Documentation (https://pprogrammingg.github.io/web3_modules/hyperscale-rs/module-01b-tx-flow.html) - Radix Labs Roadmap — To Hyperscale and Beyond (https://www.radixdlt.com/blog/radix-labs-roadmap---to-hyperscale-and-beyond) - Hyperscale Alpha Part I — The Inception of a Hybrid Consensus Mechanism (https://radixecosystem.com/news/hyperscale-alpha-part-i-the-inception-of-a-hybrid-consensus-mechanism-the-radix-blog-radix-dlt) - Hyperscale Update — 500k+ Public Test Done (https://getradix.com/updates/news/hyperscale-update-500k-public-test-done-the-radix-blog-radix-dlt) ## DeXter URL: https://radix.wiki/ecosystem/dexter Updated: 2026-02-19 Summary: Community-driven order-book DEX on Radix built on the AlphaDEX Scrypto package, proposed for retirement in February 2026. DeXter Type Decentralised Exchange (DEX) Status Proposed Closure (Feb 2026) Founded 2023 Token DEXTR Backend AlphaDEX (/ecosystem/alphadex) (Scrypto) Key People Faraz ( Radstakes (https://radstakes.com) ), Fred Liebenberg (/ecosystem/alphadex) Telegram t.me/dexter_discussion (https://t.me/dexter_discussion) (556 members) Introduction DeXter (formerly ProjectX) is a community-driven, decentralised order-book exchange on the Radix network (/contents/tech/core-concepts/radix-network) . Built on the AlphaDEX (/ecosystem/alphadex) Scrypto package developed by Fred Liebenberg, DeXter aimed to bring traditional exchange features — limit and market orders, live price charts, order history — to the Radix DeFi ecosystem in a fully decentralised manner. The project was initiated following a Medium post by Fred Liebenberg (https://medium.com/alphadex-xrd/can-a-community-build-an-app-680cfbb06f1b) exploring whether a community could build a complete exchange application on Radix. 
The post attracted contributors including Faraz Abulhawa (founder of Radstakes (https://radstakes.com) ), who became a key figure in the project's governance and operations. AlphaDEX Backend DeXter's exchange functionality was powered by AlphaDEX (/ecosystem/alphadex) , a decentralised order-book package written in Scrypto (/contents/tech/core-concepts/scrypto) by Fred Liebenberg as part of the Radix Grants Programme Cohort 1 (https://www.radixdlt.com/blog/radix-grants-program-cohort-1-update-1) . AlphaDEX provided a three-layer architecture: - On-ledger Scrypto components — direct interaction with the order-book smart contract - REST API and WebSocket server — off-chain services routing requests from the front end - JavaScript SDK — enabling third-party integrations The platform offered non-custodial trading with a maximum fee of 0.5% that automatically decreased with higher volume. Up to 90% of fees were returned to liquidity providers whose orders were matched. Tokenomics & Staking Rewards DeXter issued the DEXTR token to incentivise contributors and participants. The token represented a share in the exchange's earnings, with periodic distributions to those who contributed to the project. The project also operated a Validator Staking Rewards programme, where users who staked XRD on the DeXter validator node earned DEXTR tokens. Rewards were uploaded fortnightly to a claims component where stakers could connect their wallets to claim. 
Proposed Closure (February 2026) On 18 February 2026, Faraz (Radstakes) proposed formally retiring DeXter (https://t.me/dexter_discussion/27547) , citing three factors: - Lack of exchange activity — trading volume had declined significantly - Validator subsidy tapering — the tapering schedule made continued validator operation unsustainable beyond March 2026 - Foundation transition uncertainty — the broader Radix Foundation decentralisation created an uncertain environment for funded ecosystem projects The proposal included immediately ceasing the Validator Staking Rewards programme (with final rewards already deposited), making remaining circulating DEXTR tokens available via a limit order on a DEX, and burning all remaining DeXter tokens in the treasury. Fred Liebenberg (AlphaDEX developer) agreed (https://t.me/dexter_discussion) the project should be put on hold, noting that Radix's capabilities had evolved significantly since AlphaDEX was built — particularly with the introduction of sub-intents (/contents/tech/core-concepts/transaction-manifest) — and that any future version would require a complete rebuild. He described this as a hibernation rather than a death, expressing hope the community aspect could take on other projects in the future. Running costs were described as negligible (annual domain fee, free hosting for the backend, ~$8/month API server). A community poll was opened with options to support closure, oppose it, or suggest alternatives. External Links - Telegram — DeXter Discussion (https://t.me/dexter_discussion) - Can a Community Build an App? 
— Medium (AlphaDEX) (https://medium.com/alphadex-xrd/can-a-community-build-an-app-680cfbb06f1b) - DeXter Introduction — YouTube (https://youtu.be/vMquiiG74a4) - AlphaDEX Overview — YouTube (https://youtu.be/NkhFqpefWlA) ## Radix Accountability Council URL: https://radix.wiki/community/radix-accountability-council Updated: 2026-02-19 Summary: Five-member community-elected council guiding the transition from the Radix Foundation to the Radix DLT DAO. Radix Accountability Council Type Governance Body Status Active (Season 1) — Handover Phase Established February 2026 Members Peachy, Faraz, Jazzer_9F, Avaunt, projectShift Council Size 5 (elected via approval voting (https://www.radixdlt.com/blog/rac-consultation-results) ) Election Turnout 1.34B XRD / 1,151 accounts Telegram t.me/RadixAccountabilityCouncil (https://t.me/RadixAccountabilityCouncil) (351 members) Consultations RadixTalk (https://www.radixtalk.com) Introduction The Radix Accountability Council (RAC) is a five-member community-elected body (https://www.radixdlt.com/blog/rac-consultation-results) tasked with guiding the transition from the Radix Foundation to the Radix DLT DAO (RDD). Established in February 2026 following a series of community consultations, the council serves as the primary interface between the Radix community and the Foundation during the decentralisation of governance. The RAC's charter defines three core responsibilities (https://t.me/RadixAccountabilityCouncil) : establishing the structures and processes needed to achieve the transition, executing community will as a multi-signer on RDD-associated structures, and representing RDD interests on matters of the Foundation transition. The council operates through an open Telegram channel (https://t.me/RadixAccountabilityCouncil) with 351 members, where RAC members post as admins and the broader community can observe proceedings. 
2026 Strategy: The Next Chapter of Radix In February 2026, the Radix Foundation published its 2026 Strategy (https://www.radixdlt.com/blog/2026-strategy-the-next-chapter-of-radix) outlining the transition from Foundation-led operations to community-driven governance. The strategy represents a fundamental shift in how Radix is managed and funded. Three-Phase Decentralisation The Foundation is implementing a phased approach to handing over control: - Phase 1 — Expanding the consultation dApp for community signalling on funding allocation, and requesting RFPs for critical services (gateway operations, code maintenance). - Phase 2 — Community-initiated RFPs for marketing, partnerships, and incentive programmes. - Phase 3 — Full on-chain governance control via DAO or multi-sig structures. Key Commitments The strategy FAQ (https://www.radixdlt.com/blog/2026-strategy-faq) clarifies several important points: the Foundation does not plan to sell XRD to cover operational costs, relying instead on existing fiat reserves. Critical infrastructure will not be discontinued without reliable alternatives in place. Radix trademarks and intellectual property will transfer to community-controlled entities (potentially DAOs or trusts) to prevent capture while preserving ecosystem access. Rather than repeating previous leadership's approach of traditional marketing with substantial funding, Radix is transitioning to community-determined priorities where members can submit their own proposals for marketing, code, or business development. Background: Foundation to DAO Transition Since its inception, the Radix Foundation has served as the custodial entity overseeing the development and operational infrastructure of the Radix network. 
As the network matured, a community consultation process (https://www.radixdlt.com/blog/establishing-the-radix-accountability-council) was initiated to transition governance authority from the Foundation to a community-run decentralised autonomous organisation — the Radix DLT DAO. The consultation to establish the RAC (https://www.radixdlt.com/blog/establishing-the-radix-accountability-council) recognised that a structured transition body was needed to bridge the gap between Foundation-led and community-led governance. Rather than an abrupt handover, the council model provides continuity through elected representatives who can negotiate with the Foundation, manage multi-signature authority over DAO structures, and ensure community interests are represented throughout the process. Election & Consultation Results RAC members were elected through an approval voting mechanism (https://www.radixdlt.com/blog/rac-consultation-results) where XRD holders could vote for up to five candidates from the pool of nominees. The consultation drew participation from 1,151 unique accounts representing approximately 1.34 billion XRD in voting weight. Council Size A separate consultation on council size resulted in five members winning with 56.91% of 1.30B XRD, over alternatives of three (18.94%), seven (16.23%), or nine (7.93%) members. Elected Members

| Member | Approval % |
|---|---|
| Peachy | 92.76% |
| Faraz \| Radstakes | 83.87% |
| Jazzer_9F | 82.92% |
| Avaunt | 58.44% |
| projectShift | 57.22% |

Adam, CSO of the Radix Foundation, participates in the RAC Telegram channel (https://t.me/RadixAccountabilityCouncil) as a Foundation representative but is not a council member. Responsibilities The RAC's pinned charter (https://t.me/RadixAccountabilityCouncil) defines three core areas of responsibility: - Establish transition structures — Create and refine the legal, organisational, and governance structures needed for the Radix Foundation to hand over operational control to the Radix DLT DAO.
- Execute community will — Act as multi-signers on RDD-associated structures, ensuring that decisions made through community consultations are faithfully executed on-chain and off-chain. - Represent RDD interests — Serve as the community's voice in negotiations with the Radix Foundation on matters relating to the transition, including asset transfers, intellectual property, and operational handovers. Foundation Operational Stack As part of the transition planning, the Radix Foundation published its Operational Stack (https://www.radixdlt.com/blog/the-foundation-operational-stack-mapping-the-2026-transition) — a prioritised inventory of all services and infrastructure currently maintained by the Foundation that will need to be transferred to or replicated by the DAO. Priority 1 — Core Network Access & Resilience High-availability services essential for network interaction, requiring immediate continuity planning via an RFP process: - Babylon Gateway Endpoint — primary API for ledger indexing and transaction submission - Radix Connect Relay — signalling service enabling secure wallet-to-dApp connections - Signaling Server — enables P2P connections between wallet and desktop extension Priority 2 — High-Use & Product Maintenance Stable tools requiring active maintenance but not vital for network survival: - Core software: Node Software, Core Code Repos, Wallet (iOS/Android) - Infrastructure: Radix Connector Extension, Turn Server, Ledger App - Web properties: radixdlt.com, social channels Priority 3 — Non-Critical but Maintained Experience-improving services offering commercialisation opportunities: asset service, image service, price feeds, Dashboard, RadQuest, Gumball Club, dApp Toolkit, Dev Console, documentation sites. As of February 2026, the community is being asked to provide input (https://radixtalk.com/t/action-on-p3-foundation-services/2238) on the desired handover, open-sourcing, or shutdown of each P3 service. 
Active Handover Phase As of mid-February 2026, Adam (CSO Radix Foundation) announced (https://t.me/RadixAccountabilityCouncil/405) that the RAC has entered the active handover phase. Key activities include: - Entity formation — Sorting out a legal entity (e.g. Wyoming DUNA) is identified as the main blocker, as many assets and services require a legal entity to transfer to. - P3 service assessment — The Foundation initiated community input on P3 services via RadixTalk discussion (https://radixtalk.com/t/action-on-p3-foundation-services/2238) , with the RAC expected to coordinate the process going forward. - Custody questions — RAC members are discussing how to serve as custodians-under-trust for assets (social accounts, GitHub, Telegram channels) until a formal entity exists. - Consultation V2 — projectShift is working on running the consultation dApp autonomously, independent of Foundation infrastructure. Active Consultations As of February 2026, the RAC is facilitating several consultations through RadixTalk (https://www.radixtalk.com) : Season 2 of Radix Rewards A draft consultation (https://www.radixtalk.com/consultation/01JMCQ7SAX2B0Y72SHV1YPEWQP) on the continuation and structure of the Radix Rewards programme, which incentivises network participation and ecosystem development. DAO Entity Location A draft consultation (https://www.radixtalk.com/consultation/01JMCQKMZ3MPNZWWSXRG6FD4XR) on the legal jurisdiction and entity type for the Radix DLT DAO. Options under consideration include a Wyoming Decentralised Unincorporated Nonprofit Association (WY DUNA), Cayman Islands Foundation, Marshall Islands DAO LLC, Abu Dhabi entity, or other jurisdictions. Instabridge IP Acquisition A question raised in the RAC channel (https://t.me/RadixAccountabilityCouncil) regarding the Foundation's acquisition of intellectual property from Instabridge, seeking clarity on the terms and relevance to the DAO transition. 
External Links - Telegram — Radix Accountability Council (https://t.me/RadixAccountabilityCouncil) - 2026 Strategy — The Next Chapter of Radix (https://www.radixdlt.com/blog/2026-strategy-the-next-chapter-of-radix) - 2026 Strategy FAQ — Radix Blog (https://www.radixdlt.com/blog/2026-strategy-faq) - The Foundation Operational Stack: Mapping the 2026 Transition (https://www.radixdlt.com/blog/the-foundation-operational-stack-mapping-the-2026-transition) - RAC Consultation Results — Radix Blog (https://www.radixdlt.com/blog/rac-consultation-results) - Establishing the RAC — Radix Blog (https://www.radixdlt.com/blog/establishing-the-radix-accountability-council) - RadixTalk — Community Consultation Platform (https://www.radixtalk.com) ## Consensus Evolution at Radix URL: https://radix.wiki/contents/tech/research/consensus-evolution Updated: 2026-02-19 Summary: How Radix iterated through six consensus designs — from eMunie to Cerberus — over a decade of distributed ledger research. Consensus Evolution at Radix Founder Dan Hughes (/community/dan-hughes) Research Period 2013–2023 Iterations 6 (eMunie → Blocktrees → DAG → CAST → Tempo → Cerberus) Current Protocol Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) Academic Validation JSys peer-reviewed paper (2023) (/contents/tech/research/cerberus-whitepaper) Key Constraint Atomic composability + linear scalability Introduction Radix's consensus protocol is the result of over a decade of iterative research by founder Dan Hughes (/community/dan-hughes) . Between 2013 and 2020, Hughes designed, built, and discarded five distinct consensus architectures — each solving problems revealed by its predecessor — before arriving at Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) , the protocol that powers the Radix mainnet today. This article traces that research journey, examining what each iteration contributed and why it was ultimately superseded. 
eMunie (2013–2015) Hughes' distributed ledger work began with eMunie (/contents/tech/research/emunie) , a project launched in March 2013 — before Ethereum's whitepaper was published. eMunie was conceived as a platform for a decentralised stablecoin with an autonomous supply model that could expand and contract to maintain purchasing-power parity. The project used Delegated Proof of Stake and included features that were ahead of their time: an integrated naming system (predating ENS), encrypted messaging, a decentralised marketplace, and an asset creation system. eMunie demonstrated that a broad-featured decentralised platform was technically feasible, but its consensus design was not built for the scale required by global finance. Hughes recognised that the underlying data structures — not just the consensus rules — needed rethinking. Blocktrees & DAG Experiments (2015–2016) Hughes explored alternative data structures to move beyond linear blockchain limitations. Blocktrees extended the blockchain into a tree structure where branches could process transactions independently, merging periodically. Directed Acyclic Graphs (DAGs) — a structure later popularised by IOTA — allowed transactions to reference multiple predecessors, enabling parallel validation. Both approaches improved throughput over single-chain designs but introduced new problems. Blocktrees required complex merge logic and struggled with conflict resolution across branches. DAGs, while elegant for simple transfers, made it difficult to guarantee deterministic ordering for smart contract execution — a prerequisite for composable DeFi. CAST (2016–2017) Channeled Asynchronous State Trees (CAST) was Hughes' attempt to combine the best aspects of blocktrees and DAGs with a state-sharding model. CAST partitioned the ledger into channels that could process transactions asynchronously, with a tree structure for state management. 
This design moved closer to the sharded architecture that would eventually characterise Radix, but the asynchronous coordination model introduced edge cases around cross-channel atomicity that proved difficult to resolve formally. Tempo (2017–2019) Tempo (/contents/tech/research/tempo-consensus-mechanism) was the fifth iteration and the first to achieve public recognition. Proposed in September 2017, it introduced two innovations that would carry forward into Cerberus: - Pre-sharding — the ledger was partitioned into 2^64 (18.4 quintillion) fixed shards from genesis, so the shard space never needed to be reorganised as the network grew. - Lazy consensus — nodes only communicated when necessary to resolve conflicts, dramatically reducing coordination overhead for non-conflicting transactions. Tempo used logical clocks and temporal proofs to establish ordering between events across shards. In June 2019, Radix conducted a public test (https://www.radixdlt.com/blog) replaying 10 years of Bitcoin transactions onto a Tempo network, processing them in under 30 minutes and achieving over 1 million TPS. However, Tempo had critical weaknesses: it lacked absolute transaction finality (relying on probabilistic guarantees), and analysis revealed the Weak Atom Problem — a class of cross-shard attacks where a malicious client could exploit the lazy consensus model to create conflicting transactions on different shards. Tempo also lacked robust Sybil protection for its validator selection mechanism. Cerberus (2020–Present) Cerberus, the sixth iteration, was published in August 2020 (https://arxiv.org/abs/2008.04450) and directly addressed every weakness identified in Tempo. Its key design decisions were shaped by a decade of lessons: - From lazy consensus to BFT — Cerberus replaced Tempo's probabilistic lazy consensus with deterministic BFT consensus (derived from HotStuff (https://arxiv.org/abs/1803.05069) ), providing absolute finality on every transaction.
- Braided consensus for atomicity — rather than relying on after-the-fact conflict detection, Cerberus braids independent BFT instances across shards into a single atomic commitment. This eliminates the Weak Atom Problem by design. - Pre-sharding preserved — the practically unlimited shard space from Tempo was retained, maintaining linear scalability as validators are added. - Formal validator sets — each shard has a defined validator set with stake-weighted selection, providing Sybil resistance that Tempo lacked. The protocol was peer-reviewed and validated (/contents/tech/research/cerberus-whitepaper) through a collaboration with UC Davis' ExpoLab. Testing demonstrated over 10 million TPS on 1,024 shards with linear throughput scaling. Ongoing Research Post-Cerberus research continues through projects like Cassandra (/contents/tech/research/cassandra) (investigating liveness guarantees and dynamic validator sets) and hyperscale-rs (/contents/tech/research/hyperscale-rs) (a community-built Rust implementation exploring the next-generation Xi'an release). Dan Hughes has stated that the Xi'an upgrade will bring further improvements to shard-level liveness, validator rotation, and throughput — building on Cerberus rather than replacing it. External Links - Dan Hughes on the Radix Blog (https://www.radixdlt.com/blog-author/dan-hughes) - Cerberus paper (arXiv) (https://arxiv.org/abs/2008.04450) - Tempo Whitepaper (PDF) (https://daks2k3a4ib2z.cloudfront.net/59b6f7652473ae000158679b/59ca573e4873510001375b15_RadixDLT-Whitepaper-v1.1.pdf) - Cerberus Infographic Series (https://www.radixdlt.com/blog/cerberus-infographic-series-chapter-i) ## Radix Economic Model URL: https://radix.wiki/contents/tech/research/radix-economic-model Updated: 2026-02-19 Summary: The economic design of the Radix network: XRD emissions, fee burning, DPoS incentive alignment, and stablecoin reserve governance. 
Radix Economic Model Native Asset $XRD (/contents/tech/core-protocols/xrd-token) Max Supply 24 billion XRD Genesis Allocation 12 billion (July 2021) Network Emissions ~300 million XRD/year over ~40 years Fee Burn Rate 50% of base transaction fees Consensus Delegated Proof of Stake (/contents/tech/core-protocols/cerberus-consensus-protocol) Active Validator Set Top 100 by stake Introduction The Radix economic model is designed to align network security incentives with long-term ecosystem growth. Unlike many layer-1 protocols that rely primarily on transaction fees or fixed block rewards, Radix uses a combination of programmatic XRD emissions (/contents/tech/core-concepts/network-emissions) , fee burning, and delegated proof of stake (/contents/tech/core-concepts/validator-nodes) to create a self-sustaining economic loop. The model was outlined in the Radix Economics Proposal v2 (https://medium.com/@radixdlt/radix-economics-proposal-v2-ffcb14060594) and refined through community governance. Token Supply & Allocation XRD has a hard cap of 24 billion tokens. Half were minted at network genesis in July 2021; the other half is emitted over approximately 40 years as staking rewards. Genesis Allocation (12 Billion XRD) The initial distribution was structured to fund development, reward early participants, and reserve capital for ecosystem initiatives (shares are expressed against the 24 billion XRD max supply):

| Category | Amount | Share of Max Supply |
| --- | --- | --- |
| Radix Community (early contributors, 2013–2017) | 3,000M | 12.5% |
| RDX Works Ltd (founder retention) | 2,400M | 10.0% |
| Stablecoin Reserve (locked) | 2,400M | 10.0% |
| Radix Tokens (Jersey) Limited | 2,158M | 9.0% |
| October 2020 Token Sale | 642M | 2.7% |
| Developer Incentives | 600M | 2.5% |
| Network Subsidy (validator grants) | 600M | 2.5% |
| Liquidity Incentives | 200M | 0.8% |

All genesis tokens except the Stablecoin Reserve were fully unlocked at launch. The allocation details (https://learn.radixdlt.com/article/how-was-the-xrd-token-allocated) are published by the Radix Foundation.
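The allocation figures above can be cross-checked with a few lines of arithmetic. This is only a sanity-check sketch using the amounts from the table; the dictionary keys are abbreviated category names, not official labels:

```python
# Genesis allocation figures (millions of XRD), taken from the table above.
MAX_SUPPLY_M = 24_000  # 24 billion XRD hard cap, in millions

allocations_m = {
    "Radix Community": 3_000,
    "RDX Works Ltd": 2_400,
    "Stablecoin Reserve": 2_400,
    "Radix Tokens (Jersey) Limited": 2_158,
    "October 2020 Token Sale": 642,
    "Developer Incentives": 600,
    "Network Subsidy": 600,
    "Liquidity Incentives": 200,
}

# The genesis categories sum to 12,000M XRD: exactly half of max supply.
total_m = sum(allocations_m.values())
print(f"Genesis total: {total_m:,}M XRD")

# Each share in the table is the amount divided by the 24B max supply.
for name, amount in allocations_m.items():
    share = 100 * amount / MAX_SUPPLY_M
    print(f"{name:32s} {amount:>6,}M  {share:4.1f}%")
```

Running this confirms the table is internally consistent: the eight categories total 12,000M XRD, and each listed share matches its amount as a fraction of the 24 billion cap.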
Emission Mechanics The remaining 12 billion XRD is minted by the protocol as network emissions (/contents/tech/core-concepts/network-emissions) , distributed to validators and their delegators at approximately 300 million XRD per year. Emissions serve two purposes: compensating validators for securing the network, and gradually broadening token distribution over time. Emission rates are not fixed in perpetuity — the XRD protocol parameters (/contents/tech/core-protocols/xrd-token) include adjustment mechanisms tied to epoch duration and network utilisation metrics. This allows the emission curve to adapt if network conditions change significantly, though changes require protocol-level updates and community signalling. Delegated Proof of Stake XRD holders delegate stake to validator nodes (/contents/tech/core-concepts/validator-nodes) to participate in consensus. The top 100 validators by total delegated stake form the active set and are eligible to receive emissions. Validators set a commission percentage (deducted from their delegators' rewards), creating a competitive market for staking services. Delegators receive Liquid Stake Units (LSUs) (/contents/tech/core-concepts/liquid-stake-units) — fungible tokens representing their staked position — which can be freely transferred or used in DeFi while the underlying XRD continues earning rewards. Fee Burning & Deflationary Pressure Every transaction on Radix incurs a fee denominated in XRD. The protocol permanently burns 50% of the base network fee, removing those tokens from circulation forever. The remaining 50% is distributed to validators as additional compensation on top of emissions. This burn mechanism introduces deflationary pressure that partially offsets ongoing emissions. As network usage grows and total transaction volume increases, the burn rate rises proportionally. 
In a mature network with high transaction throughput, the annual burn could approach or exceed annual emissions — at which point XRD becomes net-deflationary, similar in principle to Ethereum's EIP-1559 (https://eips.ethereum.org/EIPS/eip-1559) mechanism but applied natively from launch rather than retrofitted. Stablecoin Reserve Governance Of the 12 billion genesis tokens, 2.4 billion XRD (10% of max supply) were locked in a Stablecoin Reserve — earmarked to bootstrap decentralised stablecoin projects native to Radix. The Radix Foundation has a 10-year window (from July 2021) to disburse these tokens. In 2025, the Radix Foundation opened a community consultation (https://bitrss.com/radix-foundation-opens-consultation-on-repurposing-2-4-billion-xrd-stablecoin-reserve-86800) on repurposing the reserve toward broader ecosystem growth. The consultation received overwhelming support (91% of weighted votes) for reallocating 1 billion XRD to a multi-season incentives campaign aimed at boosting on-chain liquidity and economic activity. Incentive Alignment The economic model is designed so that all participants' incentives converge on network health: - Validators earn emissions and fee shares, incentivising uptime and honest behaviour. Validators with poor performance lose delegators and fall out of the top 100. - Delegators earn staking yield via LSUs while contributing to network security. The liquid nature of LSUs means capital is not locked away, reducing the opportunity cost of staking. - Developers can set component royalties (/contents/tech/core-concepts/component-royalties) on their blueprints, earning XRD each time their code is called. This creates a sustainable revenue model for open-source DeFi development. - Users pay transaction fees that are low enough for practical use but high enough to prevent spam. Fee burning ensures that network usage benefits all XRD holders through supply reduction. 
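The emissions-versus-burn dynamic described above can be made concrete with a toy supply model. The ~300M XRD/year emission rate and 50% burn share come from this article; the annual fee volumes fed into the model are purely illustrative assumptions, not network data:

```python
# Toy model of annual net XRD supply change: emissions minus base-fee burn.
# Emission rate and burn share are from the economic model described above;
# the fee volumes below are hypothetical inputs, not measured network figures.
ANNUAL_EMISSIONS = 300_000_000  # ~300M XRD/year distributed as staking rewards
BURN_SHARE = 0.50               # 50% of base transaction fees are burned

def net_supply_change(annual_base_fees_xrd: float) -> float:
    """Positive result -> net inflation; negative -> net deflation."""
    burned = BURN_SHARE * annual_base_fees_xrd
    return ANNUAL_EMISSIONS - burned

# Sweep some illustrative annual fee volumes to locate the break-even point.
for fees in (50e6, 300e6, 600e6, 900e6):
    change = net_supply_change(fees)
    print(f"annual base fees {fees / 1e6:>5.0f}M XRD -> net supply {change / 1e6:+7.0f}M XRD")
```

Under these assumptions, burn exactly offsets emissions once annual base fees reach 600M XRD; beyond that point the circulating supply contracts, which is the net-deflationary regime the article describes.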
External Links - Radix Economics Proposal v2 (https://medium.com/@radixdlt/radix-economics-proposal-v2-ffcb14060594) - Radix Tokens and Tokenomics — Knowledge Base (https://learn.radixdlt.com/article/start-here-radix-tokens-and-tokenomics) - XRD Token Allocation — Knowledge Base (https://learn.radixdlt.com/article/how-was-the-xrd-token-allocated) - Radix on Tokenomics Hub (https://tokenomicshub.xyz/radix) ## Cerberus Whitepaper & Academic Validation URL: https://radix.wiki/contents/tech/research/cerberus-whitepaper Updated: 2026-02-19 Summary: The Cerberus academic paper, its peer review at JSys, and the UC Davis collaboration that formally validated Radix consensus. Cerberus Whitepaper Full Title Cerberus: Minimalistic Multi-shard Byzantine-resilient Transaction Processing Authors Florian Cäsar, Daniel P. Hughes, Joshua Primero, Mohammad Sadoghi First Published August 2020 ( arXiv (https://arxiv.org/abs/2008.04450) ) Peer Review Journal of Systems Research (JSys) (https://www.jsys.org/) , 2023 Academic Partner UC Davis ExpoLab (https://expolab.org/) Subject Distributed Consensus, Sharding, BFT Related Cerberus Protocol (/contents/tech/core-protocols/cerberus-consensus-protocol) Introduction The Cerberus whitepaper, first published on arXiv in August 2020 (https://arxiv.org/abs/2008.04450) , presents a consensus protocol designed to achieve parallelised transaction processing across an effectively unlimited number of shards while preserving atomic composability (/contents/tech/core-concepts/atomic-composability) . The paper was authored by Florian Cäsar, Daniel P. Hughes (https://www.radixdlt.com/blog-author/dan-hughes) (founder of Radix), Joshua Primero, and Professor Mohammad Sadoghi (https://expolab.org/team.html) of the University of California, Davis. 
Unlike most blockchain whitepapers that remain unreviewed, Cerberus underwent rigorous academic scrutiny and was accepted into the Journal of Systems Research (JSys) (https://www.jsys.org/) in 2023 after a full peer-review process. This places Radix among a small number of public ledger projects whose core consensus mechanism has been validated to the highest academic standards. Paper Overview The paper addresses a fundamental tension in distributed ledger design: how to shard a network for scalability without sacrificing the ability to atomically compose transactions across shards. The authors argue that existing sharded protocols either impose global ordering (limiting throughput) or sacrifice atomicity (breaking composability). Problem Statement Traditional blockchains process transactions sequentially on a single chain, creating a throughput bottleneck. Sharding (/contents/tech/core-concepts/sharding) partitions the ledger across independent groups of validators, enabling parallel processing. However, when a transaction touches data on multiple shards, the shards must coordinate — and prior approaches either relied on expensive cross-shard locking mechanisms, required global consensus, or could not guarantee atomicity. Three Protocol Variants Cerberus is presented in three variants of increasing robustness: - Core-Cerberus — operates under strict environmental assumptions with well-behaved clients, requiring no additional coordination beyond single-shard BFT consensus. Each shard independently decides to commit or abort, and the UTXO-based state model prevents double-spending without global ordering. - Optimistic-Cerberus — avoids extra coordination phases during normal operation but includes recovery mechanisms for Byzantine behaviour, accepting higher costs only when attacks are detected. 
- Pessimistic-Cerberus — adds proactive coordination phases that allow operation in fully adversarial environments, trading some latency for consistent safety guarantees regardless of client behaviour. UC Davis Collaboration In 2020, Radix partnered with UC Davis' ExpoLab (https://www.radixdlt.com/blog/radix-partners-with-top-us-research-lab-to-bring-its-new-cerberus-consensus-to-life) , led by Professor Mohammad Sadoghi, to provide independent academic validation of Cerberus. The ExpoLab team included postdoctoral fellow Jelle Hellings, and PhD candidates Suyash Gupta and Sajjad Rahnama — researchers with deep expertise in Byzantine fault-tolerant systems. The collaboration focused on four areas: - Formal mathematical proofs — creating rigorous proofs of safety and liveness for each protocol variant, going beyond the original whitepaper's informal arguments. - Security analysis — identifying potential attack vectors (equivocation, cross-shard deadlocks, validator collusion) and verifying that the protocol's mitigations hold under adversarial conditions. - Implementation testing — deploying Cerberus on ExpoLab's ExpoDB platform (https://expolab.org/) to benchmark real-world performance, latency, and throughput characteristics. - Comparative analysis — benchmarking Cerberus against other sharded BFT protocols (AHL, ByShard, Caper) on scalability, liveness, and safety metrics. The resulting peer-reviewed evaluation, published in JSys, confirmed that Cerberus achieves linear throughput scaling with the number of shards while maintaining atomic cross-shard commitment — a combination no prior protocol had demonstrated under formal analysis. Key Technical Contributions The paper makes several contributions to distributed systems research: - Braided consensus — Cerberus "braids" independent single-shard BFT consensus instances (3-chains) into an emergent multi-shard consensus (a 3-braid). 
Each shard runs its own HotStuff-derived BFT instance; when a transaction spans multiple shards, their consensus processes are temporarily linked to reach a joint commit-or-abort decision. - Minimised coordination — cross-shard commitment requires only a single additional consensus step per involved shard, with no global ordering or leader election. Shards that are not involved in a given transaction are unaffected. - UTXO-based conflict prevention — by adopting a substate (/contents/tech/core-concepts/substate-model) (UTXO-like) model, data must be consumed and recreated to be modified, preventing concurrent modification conflicts without cross-shard locks. - Cluster-send primitive — a communication primitive that prevents equivocation by ensuring a validator cannot send conflicting messages to different shards within the same transaction. External Links - Cerberus paper on arXiv (https://arxiv.org/abs/2008.04450) - Cerberus Whitepaper v1.01 (PDF) (https://assets.website-files.com/6053f7fca5bf627283b582c2/608811e3f5d21f235392fee1_Cerberus-Whitepaper-v1.01.pdf) - Radix–UC Davis partnership announcement (https://www.radixdlt.com/blog/radix-partners-with-top-us-research-lab-to-bring-its-new-cerberus-consensus-to-life) - Cerberus Infographic Series (https://www.radixdlt.com/blog/cerberus-infographic-series-chapter-i) - Journal of Systems Research (https://www.jsys.org/) ## Cerberus vs Other BFT Protocols URL: https://radix.wiki/contents/tech/comparisons/cerberus-vs-other-bft-protocols Updated: 2026-02-19 Summary: Comparison of Cerberus with PBFT, HotStuff, Tendermint, and Narwhal/Bullshark on shard support, atomicity, throughput, and finality. 
Cerberus vs Other BFT Protocols Category Consensus Protocol Comparison Cerberus Parallelised BFT with braiding (/contents/tech/core-protocols/cerberus-consensus-protocol) Peer Review arXiv:2008.04450 (https://arxiv.org/pdf/2008.04450) (2020, JSys 2023) BFT Threshold < 1/3 Byzantine per shard Introduction Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) is Radix's consensus protocol, first published in 2020 (https://arxiv.org/pdf/2008.04450) and subsequently peer-reviewed in the Journal of Systems Research (2023). Independent reviewers concluded that Cerberus is among the most efficient multi-shard consensus protocols in terms of throughput and latency. This article compares it against established BFT (/contents/tech/core-concepts/byzantine-fault-tolerance) protocols. Single-Pipeline Protocols PBFT (Practical Byzantine Fault Tolerance) PBFT (https://en.wikipedia.org/wiki/Practical_Byzantine_fault_tolerance) (Castro & Liskov, 1999) is the foundational BFT protocol. It runs a three-phase commit process (pre-prepare, prepare, commit) requiring O(n²) message complexity. PBFT provides deterministic finality but is limited to small validator sets (~20-50) due to communication overhead. All transactions are processed sequentially through a single leader. HotStuff HotStuff (https://arxiv.org/abs/1803.05069) (2018) reduces PBFT's message complexity to O(n) per round through a pipelining approach with rotating leaders. It formed the basis for Meta's abandoned Diem/Libra project and for the Aptos blockchain. HotStuff is more scalable than PBFT for larger validator sets but remains a single-pipeline protocol — all transactions are totally ordered. CometBFT (Tendermint) CometBFT (https://docs.cometbft.com/) (formerly Tendermint) is a BFT protocol used across the Cosmos ecosystem. It provides instant finality (no probabilistic confirmation) with O(n²) communication complexity.
Each Cosmos chain runs its own CometBFT instance, enabling the multi-chain architecture but limiting individual chain throughput. DAG-Based Protocols Narwhal/Bullshark Narwhal (https://arxiv.org/abs/2105.11827) (2021) separates data dissemination from consensus ordering using a DAG (Directed Acyclic Graph) structure. Bullshark (https://arxiv.org/abs/2201.05677) (2022) adds partially synchronous BFT ordering on top. This combination achieves high throughput by parallelising data availability but still requires total ordering of all transactions — a single-shard model. Narwhal/Bullshark formed the basis of Sui's consensus layer. Cerberus: Parallelised BFT Cerberus takes a fundamentally different approach. Rather than optimising a single consensus pipeline, it runs a proven BFT process in parallel across practically unlimited shards, each maintaining its own ordered transaction stream. Cross-shard transactions are handled through "braiding" — a synchronisation mechanism that achieves atomic commitment across involved shards without global coordination.

| Property | PBFT | HotStuff | CometBFT | Narwhal/Bullshark | Cerberus |
| --- | --- | --- | --- | --- | --- |
| Shard Support | Single | Single | Single (per chain) | Single | Unlimited |
| Cross-Shard Atomicity | N/A | N/A | Via IBC (async) | N/A | Native (braided) |
| Message Complexity | O(n²) | O(n) | O(n²) | O(n) | O(n) per shard |
| Finality | Deterministic | Deterministic | Deterministic | Deterministic | Deterministic |
| Throughput Scaling | Fixed ceiling | Fixed ceiling | Fixed per chain | Fixed ceiling | Linear with nodes |

The key innovation is that Cerberus provides atomic composability (/contents/tech/core-concepts/atomic-composability) across shards — something no other production BFT protocol achieves — while maintaining linear throughput scaling. External Links - Cerberus: A Parallelized BFT Consensus Protocol (https://arxiv.org/pdf/2008.04450) — arXiv - What is Cerberus?
(https://learn.radixdlt.com/article/what-is-cerberus) — Radix Knowledge Base - PBFT (https://en.wikipedia.org/wiki/Practical_Byzantine_fault_tolerance) — Wikipedia - HotStuff: BFT Consensus in the Lens of Blockchain (https://arxiv.org/abs/1803.05069) — arXiv ## Radix vs Polkadot URL: https://radix.wiki/contents/tech/comparisons/radix-vs-polkadot Updated: 2026-02-19 Summary: Technical comparison of Radix's sharded architecture and Polkadot's relay chain plus parachain model for scalability and interoperability. Radix vs Polkadot Category Layer-1 Comparison Radix Model Single sharded network Polkadot Model Relay chain + parachains Radix Consensus Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) Polkadot Consensus BABE + GRANDPA (https://wiki.polkadot.network/docs/learn-consensus) Introduction Polkadot (https://polkadot.network/) and Radix both address blockchain scalability through parallelism, but via different architectures. Polkadot uses a relay chain that coordinates specialised parachains, while Radix uses a single sharded network with Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) consensus. Both were designed from scratch rather than forked from existing codebases. Scalability Architecture Polkadot: Relay Chain + Parachains Polkadot's relay chain (https://wiki.polkadot.network/docs/learn-architecture) provides shared security and finalisation. Parachains are independent blockchains built with Substrate (https://substrate.io/) that run in parallel, producing blocks validated by relay chain validators. Parachain slots are limited (originally ~100), creating artificial scarcity. Cross-parachain communication uses XCM (Cross-Consensus Messaging) (https://wiki.polkadot.network/docs/learn-xcm) — an asynchronous message format. Radix: Unified Sharding Radix does not separate execution into discrete chains. 
Instead, all state lives in a unified substate model (/contents/tech/core-concepts/substate-model) across practically unlimited shards, all secured by the same Cerberus consensus. There is no slot scarcity — any developer can deploy Components without acquiring parachain slots or paying auction fees. Cross-Shard Composability Both platforms aim to enable cross-shard/cross-chain interactions, but the guarantees differ: - Polkadot XCM — asynchronous message passing between parachains. Messages are delivered reliably but not atomically — a multi-parachain operation cannot be reverted as a single unit if one step fails. This makes complex DeFi compositions (flash loans, atomic arbitrage) across parachains difficult. - Radix Cerberus — synchronous atomic consensus across shards. A single Transaction Manifest (/contents/tech/core-protocols/transaction-manifests) can atomically interact with Components on different shards, with the entire transaction reverting if any step fails. Developer Experience Polkadot's Substrate framework is powerful but complex — building a parachain requires implementing runtime logic in Rust, understanding FRAME pallets, and managing parachain lifecycle (slot auctions, crowdloans). Substrate is designed for building custom blockchains, which is a different level of abstraction from building dApps. Radix's Scrypto (/contents/tech/core-protocols/scrypto-programming-language) is specifically designed for dApp development. Deploying a Package (/contents/tech/core-concepts/blueprints-and-packages) to Radix is analogous to deploying a smart contract — it does not require running infrastructure or acquiring scarce resources. This makes Radix more accessible for DeFi developers, while Polkadot is more suited for teams building specialised execution environments. 
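The atomicity contrast above can be illustrated with a toy model. Nothing here reflects either platform's real APIs: `atomic_execute`, `swap`, and `lend` are hypothetical stand-ins for operations that would live on different shards or parachains.

```python
# Toy contrast: atomic cross-shard execution (Radix-style) versus
# asynchronous message passing (XCM-style). All names are illustrative.

def atomic_execute(steps, state):
    """Apply every step to a scratch copy; commit only if all succeed."""
    scratch = dict(state)
    try:
        for step in steps:
            step(scratch)
    except Exception:
        return state      # any failure reverts the whole transaction
    return scratch        # all steps commit together

def swap(s):              # hypothetical DEX swap: succeeds
    s["dex"] -= 100
    s["wallet"] += 95

def lend(s):              # hypothetical lending step: fails mid-composition
    raise RuntimeError("lending pool paused")

state = {"dex": 1000, "wallet": 0}
after = atomic_execute([swap, lend], state)
assert after == state     # atomic: the successful swap is rolled back too

# Under asynchronous messaging, the swap would already be final on its own
# chain before the lending failure surfaced elsewhere, leaving the user to
# unwind the partial result manually.
```

The `assert` captures the guarantee the article describes: with synchronous atomic consensus, a multi-step composition either fully commits or leaves no trace, whereas asynchronous message passing can strand intermediate state.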
External Links - Polkadot Network (https://polkadot.network/) - Polkadot Wiki (https://wiki.polkadot.network/) - Cerberus Whitepaper (https://arxiv.org/pdf/2008.04450) — arXiv ## Radix vs Cosmos URL: https://radix.wiki/contents/tech/comparisons/radix-vs-cosmos Updated: 2026-02-19 Summary: Technical comparison of Radix's sharded single-network approach and Cosmos's app-chain architecture with IBC interoperability. Radix vs Cosmos Category Layer-1 Comparison Radix Model Single sharded network Cosmos Model Sovereign app-chains + IBC (https://ibcprotocol.dev/) Radix Consensus Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) Cosmos Consensus CometBFT (https://docs.cometbft.com/) (Tendermint) Introduction Cosmos (https://cosmos.network/) pioneered the multi-chain thesis — a network of sovereign, application-specific blockchains connected via the Inter-Blockchain Communication (IBC) (https://ibcprotocol.dev/) protocol. Radix takes the opposite approach: a single network with native sharding (/contents/tech/core-concepts/sharding) and atomic composability (/contents/tech/core-concepts/atomic-composability) . Both aim to solve scalability, but their philosophies differ fundamentally. Architecture Cosmos: Sovereign App-Chains In Cosmos, each application runs on its own blockchain built with the Cosmos SDK (https://docs.cosmos.network/) . Each chain runs CometBFT (https://docs.cometbft.com/) (formerly Tendermint) consensus independently, with its own validator set and security guarantees. Cross-chain communication occurs through IBC — an asynchronous message-passing protocol that handles token transfers and data between chains. Radix: Single Sharded Network Radix operates as one network where all Components share the same Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) consensus and security guarantees. 
State is distributed across shards via the substate model (/contents/tech/core-concepts/substate-model) , but all shards participate in a unified consensus process. Cross-shard transactions are atomic — they either fully commit or fully revert across all involved shards. Composability vs Sovereignty The fundamental trade-off is composability vs sovereignty: - Cosmos — each chain is sovereign (controls its own security, governance, and tokenomics) but cross-chain interactions are asynchronous. A DEX swap on Osmosis cannot atomically compose with a lending operation on Umee in a single transaction. IBC transfers take seconds to minutes and introduce bridge risk. - Radix — all Components share security and can atomically compose. A single Transaction Manifest (/contents/tech/core-protocols/transaction-manifests) can orchestrate operations across any Components on the network. The trade-off is that applications cannot have independent security or consensus parameters. For DeFi specifically, atomic composability is valuable — flash loans, multi-hop swaps, and leveraged positions require atomic execution guarantees that IBC cannot provide. Developer Experience Cosmos SDK uses Go for application logic, with each chain requiring its own validator set, security budget, and operational infrastructure. This creates high barriers for launching new applications but grants complete customisation. Radix uses Scrypto (/contents/tech/core-protocols/scrypto-programming-language) (Rust-based) with Blueprints and Packages (/contents/tech/core-concepts/blueprints-and-packages) . Deploying a new application does not require running validators or bootstrapping security — it inherits the network's full security from day one. This dramatically lowers the barrier to deployment but removes the sovereignty that Cosmos chains provide. External Links - Cosmos Network (https://cosmos.network/) - IBC Protocol (https://ibcprotocol.dev/) - What is Cerberus? 
(https://learn.radixdlt.com/article/what-is-cerberus) — Radix Knowledge Base ## Radix vs Solana URL: https://radix.wiki/contents/tech/comparisons/radix-vs-solana Updated: 2026-02-19 Summary: Technical comparison of Radix and Solana covering consensus, programming model, scalability architecture, and reliability. Radix vs Solana Category Layer-1 Comparison Radix Consensus Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) (parallelised BFT) Solana Consensus Tower BFT + Proof of History (https://solana.com/docs/intro/consensus) Radix Language Scrypto (/contents/tech/core-protocols/scrypto-programming-language) (Rust-based) Solana Language Rust / Anchor (https://www.anchor-lang.com/) Introduction Solana (https://solana.com/) is a high-throughput Layer-1 blockchain that achieves speed through a single-shard, highly optimised architecture. Radix (https://www.radixdlt.com/) targets similar performance goals through multi-shard parallelism. Both prioritise DeFi use cases, but their architectural approaches differ fundamentally. Consensus & Architecture Solana: Single-Shard High Throughput Solana uses Tower BFT (https://solana.com/docs/intro/consensus) (a PBFT variant) combined with Proof of History (PoH) — a verifiable delay function that creates a historical record proving events occurred at specific points in time. This eliminates the need for validators to communicate timestamps, reducing consensus overhead. Solana processes transactions on a single shard at ~4,000 TPS (theoretical max ~65,000), achieving 400ms block times. Radix: Multi-Shard Parallelism Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) parallelises a proven BFT process across practically unlimited shards (/contents/tech/core-concepts/sharding) . Rather than optimising a single pipeline, it runs many pipelines in parallel and "braids" them together only for cross-shard transactions. 
This provides linear scalability (https://learn.radixdlt.com/article/what-is-cerberus) — throughput grows with nodes — versus Solana's fixed throughput ceiling. Programming Model Both platforms use Rust-based languages, but the paradigms diverge. Solana programs (smart contracts) use raw Rust or the Anchor framework (https://www.anchor-lang.com/) . Programs operate on accounts — data containers owned by programs — in a model where tokens are entries in program-owned accounts (SPL Token standard). Developers must manage account sizing, rent, and manual serialisation. Scrypto uses Asset-Oriented Programming (/contents/tech/core-concepts/asset-oriented-programming) where resources are native engine primitives. Tokens are not entries in account data — they are physical-like objects managed by the engine. This eliminates entire vulnerability classes (reentrancy, approval exploits) and removes the need for manual account management. Reliability Solana has experienced multiple network outages since launch, with the network halting entirely several times. These incidents stem from the single-shard architecture — when the leader validator or critical infrastructure is overwhelmed, the entire network stalls. Radix's multi-shard design mitigates this risk: individual shard disruptions do not halt the entire network. However, Radix's full sharded consensus (Xi'an) has not yet been deployed to mainnet, so a direct reliability comparison at scale is premature. Trade-offs Solana's strengths include a mature, battle-tested ecosystem with significant DeFi TVL, a large developer community, and proven high throughput. Its weaknesses are the throughput ceiling, reliability concerns, and complex programming model. Radix offers superior developer ergonomics and theoretical scalability but has a smaller ecosystem and has not yet proven its sharded architecture at mainnet scale. External Links - Solana Documentation (https://solana.com/docs) - What is Cerberus? 
(https://learn.radixdlt.com/article/what-is-cerberus) — Radix Knowledge Base - Cerberus Whitepaper (https://arxiv.org/pdf/2008.04450) — arXiv ## Radix vs Ethereum URL: https://radix.wiki/contents/tech/comparisons/radix-vs-ethereum Updated: 2026-02-19 Summary: Technical comparison of Radix and Ethereum across consensus, smart contracts, state model, scalability, and developer experience. Radix vs Ethereum Category Layer-1 Comparison Radix Consensus Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) (parallelised BFT) Ethereum Consensus Gasper (https://ethereum.org/en/developers/docs/consensus-mechanisms/pos/) (PoS + Casper FFG) Radix VM Radix Engine (/contents/tech/core-protocols/radix-engine) (WASM) Ethereum VM EVM (https://ethereum.org/en/developers/docs/evm/) (bytecode) Introduction Ethereum (https://ethereum.org/) is the most widely adopted smart contract platform, with the largest ecosystem of dApps, developers, and total value locked. Radix (https://www.radixdlt.com/) is a purpose-built DeFi platform designed to address Ethereum's limitations in scalability, security, and developer experience. This article compares their architectures objectively. Smart Contract Model Ethereum: EVM + Solidity Ethereum uses the Ethereum Virtual Machine (EVM) (https://ethereum.org/en/developers/docs/evm/) to execute bytecode compiled from languages such as Solidity. Tokens are implemented as smart contracts (ERC-20, ERC-721) — a token is fundamentally a mapping of balances inside a contract's storage. This design has enabled massive innovation but also produces well-known vulnerabilities: reentrancy attacks, approval exploits (approve/transferFrom), and accidental token loss from sending to contracts without withdrawal functions. Radix: Radix Engine + Scrypto Radix uses the Radix Engine (/contents/tech/core-protocols/radix-engine) executing Scrypto (/contents/tech/core-protocols/scrypto-programming-language) (compiled to WebAssembly).
Assets are native platform primitives (/contents/tech/core-concepts/asset-oriented-programming) that behave like physical objects — they cannot be copied, accidentally destroyed, or lost. Reentrancy is impossible by design, and there is no approval pattern. The engine enforces resource conservation at the platform level. State Model & Scalability Ethereum: Account Model + L2 Rollups Ethereum uses a global account model where every transaction can potentially touch any contract's state. This requires total global ordering (https://www.radixdlt.com/blog/how-radix-engine-is-designed-to-scale-dapps) of all transactions, creating a fundamental throughput bottleneck (~15 TPS on L1). Ethereum's scaling strategy relies on Layer-2 rollups (Optimistic and ZK) that process transactions off-chain and post proofs to L1, fragmenting liquidity and composability across L2s. Radix: Substate Model + Native Sharding Radix uses a substate model (/contents/tech/core-concepts/substate-model) where state is decomposed into discrete records mapped to shards (/contents/tech/core-concepts/sharding) . Transactions specify which substates they need via intent-based manifests (/contents/tech/core-protocols/transaction-manifests) , enabling parallel processing. Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) "braids" consensus across shards only when needed, providing linear scalability (https://learn.radixdlt.com/article/what-is-cerberus) while preserving atomic composability (/contents/tech/core-concepts/atomic-composability) — no fragmentation. Accounts & Transactions Accounts Ethereum has two account types: Externally Owned Accounts (EOAs) controlled by private keys and smart contract accounts. 
ERC-4337 (https://eips.ethereum.org/EIPS/eip-4337) adds account abstraction via UserOperations, a separate mempool, Bundler nodes, and an EntryPoint contract — significant additional complexity (https://www.radixdlt.com/blog/comparing-account-abstraction-and-radix-smart-accounts) . On Radix, every account is natively a Smart Account (/contents/tech/core-protocols/smart-accounts) Component with built-in multi-factor auth, social recovery, and configurable deposit rules. Transactions Ethereum transactions contain opaque calldata that wallets cannot meaningfully display to users — users sign data they cannot understand. Radix Transaction Manifests (/contents/tech/core-protocols/transaction-manifests) are human-readable instruction scripts that the Radix Wallet (/contents/tech/core-protocols/radix-wallet) can parse and display, showing users exactly what a transaction will do before they sign. Ecosystem & Maturity Ethereum has clear advantages in ecosystem size: thousands of dApps, hundreds of thousands of developers, and hundreds of billions in TVL. Its EVM has become the de facto standard, supported by major L2s (Arbitrum, Optimism, Base, zkSync), tooling (Hardhat, Foundry, OpenZeppelin), and institutional adoption. Radix's ecosystem is significantly smaller but growing. Its advantage lies in architectural design — purpose-built for DeFi rather than retrofitted — with fewer smart contract vulnerabilities by design and a developer experience that reduces complexity (https://docs.radixdlt.com/docs/blueprints-and-components) for financial applications. The trade-off is a smaller developer community, fewer tools, and less battle-tested infrastructure. 
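The token-model contrast described above — balance mappings inside contract storage versus native, object-like resources — can be sketched as follows. These are hypothetical classes for illustration, not Solidity or Scrypto APIs.

```python
# Illustrative contrast between two token models (hypothetical classes).

# ERC-20 style: a token is a mapping of balances inside one contract's storage.
class MappingToken:
    def __init__(self, supply, owner):
        self.balances = {owner: supply}
    def transfer(self, frm, to, amount):
        if self.balances.get(frm, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[frm] -= amount
        self.balances[to] = self.balances.get(to, 0) + amount

# Asset-oriented style: tokens sit in vaults and move as indivisible "buckets";
# tokens taken from one vault must be put somewhere, so supply is conserved.
class Vault:
    def __init__(self, amount=0):
        self.amount = amount
    def take(self, amount):
        if amount > self.amount:
            raise ValueError("insufficient balance")
        self.amount -= amount
        return {"amount": amount}            # a bucket: the only handle to the tokens
    def put(self, bucket):
        self.amount += bucket.pop("amount")  # bucket is emptied on deposit

alice, bob = Vault(100), Vault(0)
bob.put(alice.take(40))
assert alice.amount + bob.amount == 100      # conservation holds by construction
```

In the mapping model, conservation is only as good as each contract's arithmetic; in the vault/bucket model it falls out of the fact that tokens are moved rather than recomputed, which is the property the Radix Engine enforces at the platform level.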
External Links - Comparing Account Abstraction and Radix Smart Accounts (https://www.radixdlt.com/blog/comparing-account-abstraction-and-radix-smart-accounts) — Radix Blog - How Radix Engine is Designed to Scale dApps (https://www.radixdlt.com/blog/how-radix-engine-is-designed-to-scale-dapps) — Radix Blog - Ethereum Developer Documentation (https://ethereum.org/en/developers/docs/) ## Radix Core API URL: https://radix.wiki/contents/tech/core-protocols/radix-core-api Updated: 2026-02-18 Summary: Low-level API exposed directly by Radix full nodes for raw ledger access, transaction construction, and node management — primarily used by infrastructure operators Radix Core API Type Low-level node API Exposed by Radix full nodes Audience Infrastructure operators, Gateway providers Higher-level Radix Gateway API (/contents/tech/core-protocols/radix-gateway-api) Documentation Radix Docs (https://docs.radixdlt.com/docs/network-apis) Introduction The Radix Core API is a low-level API exposed directly by Radix full nodes, providing raw access to ledger state, transaction submission, consensus status, and node management. Unlike the Gateway API (/contents/tech/core-protocols/radix-gateway-api) which is designed for dApp developers, the Core API is primarily used by infrastructure operators, Gateway providers, and tools that need direct node communication. Architecture The Core API runs on every Radix full node and provides the lowest-level programmatic access to the network. The Gateway API (/contents/tech/core-protocols/radix-gateway-api) is built on top of the Core API — it ingests data from Core API endpoints, indexes it into a PostgreSQL database, and exposes a higher-level query interface.
Key capabilities: - Raw ledger streaming — access committed transactions as they are finalised - Transaction construction — low-level transaction building and submission - Mempool inspection — view pending transactions - Node status — consensus health, sync status, peer information - State queries — direct access to the Substate Model (/contents/tech/core-concepts/substate-model) Most dApp developers should use the Gateway API (/contents/tech/core-protocols/radix-gateway-api) or the TypeScript SDK (https://docs.radixdlt.com/docs/gateway-sdk) rather than the Core API directly, as the Gateway provides pagination, indexing, and a more convenient query model. External Links - Network APIs (/developers/legacy-docs/integrate/network-apis/network-apis) — Radix Documentation - babylon-node (https://github.com/radixdlt/babylon-node) — GitHub Repository ## Radix Gateway API URL: https://radix.wiki/contents/tech/core-protocols/radix-gateway-api Updated: 2026-02-18 Summary: JSON-based HTTP API enabling dApps, wallets, and explorers to query ledger state, submit transactions, and stream network activity without running a full node. Radix Gateway API Type JSON HTTP API Current Version v1.10.4 Protocol HTTP POST (all endpoints) SDK @radixdlt/babylon-gateway-api-sdk (https://www.npmjs.com/package/@radixdlt/babylon-gateway-api-sdk) API Reference Redoc Documentation (https://radix-babylon-gateway-api.redoc.ly/) Source Code GitHub (https://github.com/radixdlt/babylon-gateway) Introduction The Radix Gateway API is a JSON-based HTTP API layer exposed by the Babylon Radix Gateway that enables dApp frontends, wallets (/contents/tech/core-protocols/radix-wallet) , and explorers to efficiently query current and historical ledger state, submit transactions, and stream network activity — all without running a full node. 
The Gateway sits between dApp clients and the Radix Core API (/contents/tech/core-protocols/radix-core-api) (which communicates directly with validator nodes (/contents/tech/core-concepts/validator-nodes) ), providing a developer-friendly abstraction layer with features like cursor-based pagination, historical state browsing, and consistent reads. API Groups The Gateway API is organised into five main groups, all using HTTP POST: - Status (/status/*) — gateway and network configuration, health checks - Transaction (/transaction/*) — construction metadata, preview (dry-run), submission, and status tracking - Stream (/stream/*) — read committed transactions with filters (by account, resource, badge, event type, manifest class) - State (/state/*) — query current and historical entity state (accounts, components, resources, validators, key-value stores, NFT data) - Statistics (/statistics/*) — network statistics, validator uptime Additional Extensions endpoints provide specialised queries like resource holder lists and role requirement lookups. Developer Usage Most Radix dApp developers interact with the Gateway through the TypeScript SDK (https://www.npmjs.com/package/@radixdlt/babylon-gateway-api-sdk) . If using the Radix dApp Toolkit (/contents/tech/core-protocols/radix-connect) , the Gateway SDK is already integrated and does not need separate import. Key features for developers: - Historical state — browse entity state at any past ledger version via at_ledger_state - Consistent reads — pin multiple queries to the same ledger version snapshot - Opt-in properties — reduce bandwidth by requesting only needed fields - Transaction preview — dry-run manifests before submission to check outcomes Providers The official Radix gateway is the primary provider, but third-party services including RadixAPI and NowNodes also offer Gateway API access for redundancy and geographic distribution. 
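The consistent-read feature described above can be sketched by showing how two queries are pinned to the same ledger snapshot. The endpoint paths and the `at_ledger_state` / `state_version` field names follow the Gateway API as published, but this is a request-building sketch only (no network call); the base URL, ledger version, and account address are placeholder assumptions — verify shapes against the Redoc reference.

```python
# Sketch: pinning multiple Gateway API queries to one ledger snapshot.
# Field names and endpoints are per the published schema (verify before use);
# GATEWAY, the state version, and the address below are hypothetical.
import json

GATEWAY = "https://mainnet.radixdlt.com"   # assumed public gateway base URL

def pinned_query(endpoint, body, state_version):
    """Return (url, payload) for a POST pinned at a fixed ledger version."""
    payload = dict(body)
    payload["at_ledger_state"] = {"state_version": state_version}
    return f"{GATEWAY}{endpoint}", json.dumps(payload)

pin = 123_456_789  # hypothetical ledger state version
url1, q1 = pinned_query("/state/entity/details",
                        {"addresses": ["account_rdx1..."]}, pin)
url2, q2 = pinned_query("/stream/transactions", {"limit_per_page": 10}, pin)

# Both queries carry the same snapshot pin, so their results are consistent:
assert json.loads(q1)["at_ledger_state"] == json.loads(q2)["at_ledger_state"]
```

Without the pin, the two POSTs could land on different ledger versions and return a momentarily inconsistent view of, say, an account balance and the transactions that produced it.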
External Links - Gateway API Reference (https://radix-babylon-gateway-api.redoc.ly/) — Redoc Documentation - Gateway SDK (/developers/legacy-docs/build/build-dapps/dapp-application-stack/dapp-sdks/gateway-sdk) — Radix Documentation - Network APIs (/developers/legacy-docs/integrate/network-apis/network-apis) — Radix Documentation - babylon-gateway (https://github.com/radixdlt/babylon-gateway) — GitHub Repository ## Liquid Stake Units (LSUs) URL: https://radix.wiki/contents/tech/core-concepts/liquid-stake-units Updated: 2026-02-18 Summary: Fungible tokens representing staked XRD positions that can be freely traded or used as DeFi collateral while the underlying XRD continues earning staking rewards Liquid Stake Units (LSUs) Type Fungible resource token Issued by Validator Components Backed by Staked XRD + accrued emissions Tradeable Yes (native Radix resource) Unstaking Delay ~2,016 epochs (~2 weeks) Dashboard Radix Dashboard (https://dashboard.radixdlt.com/network-staking) Introduction Liquid Stake Units (LSUs) are fungible tokens issued by Validator (/contents/tech/core-concepts/validator-nodes) Components when XRD is staked. Each validator issues its own unique LSU resource, representing a proportional claim on the validator's total staked XRD plus accrued emissions (/contents/tech/core-concepts/network-emissions) rewards. Unlike traditional staking models where staked tokens are locked and illiquid, LSUs are standard Radix resources (/contents/tech/core-concepts/asset-oriented-programming) that can be freely transferred, traded on DEXes, or used as collateral in DeFi protocols — all while the underlying XRD continues earning staking rewards. How LSUs Work When a user stakes XRD to a validator, the Validator Component mints LSU tokens and deposits them into the user's account. The exchange rate between LSU and XRD increases over time as emissions rewards accrue.
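This rate mechanism can be sketched with hypothetical pool figures: the redemption rate is simply the validator's total staked XRD divided by the outstanding LSU supply, so auto-compounded emissions raise the rate without minting new LSUs.

```python
# Sketch of LSU redemption mechanics (pool figures are hypothetical).

def redemption_rate(pool_xrd, lsu_supply):
    """XRD redeemable per LSU for one validator's stake pool."""
    return pool_xrd / lsu_supply

pool_xrd, lsu_supply = 1_000_000.0, 1_000_000.0   # rate starts at 1.0
assert redemption_rate(pool_xrd, lsu_supply) == 1.0

pool_xrd += 50_000   # emissions are re-staked into the pool; no new LSUs minted
rate = redemption_rate(pool_xrd, lsu_supply)
assert abs(rate - 1.05) < 1e-12   # each LSU now redeems for ~1.05 XRD

# Unstaking burns LSUs for a proportional share of the pool (after the delay):
xrd_back = 200 * rate
assert abs(xrd_back - 210.0) < 1e-9
```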
For example, if 1 LSU was initially worth 1 XRD, after a period of emissions it might be worth 1.05 XRD — the protocol auto-compounds rewards by increasing the redemption value. To unstake, a user returns their LSUs to the Validator Component, which burns them and initiates an unstaking delay of approximately 2,016 epochs (~2 weeks). After the delay, the user receives their proportional share of XRD. DeFi Composability Because LSUs are standard resources, they integrate natively with the Radix DeFi ecosystem: - DEX trading — LSUs can be swapped on Ociswap (/ecosystem/ociswap) , CaviarNine (/ecosystem/caviarnine) , and other DEXes - Lending collateral — protocols like Weft Finance (/ecosystem/weft-finance) accept LSUs as collateral - Liquidity provision — LSU/XRD pools allow liquid staking liquidity This composability means stakers no longer face a binary choice between earning staking rewards and participating in DeFi. External Links - Tokens and Tokenomics (https://learn.radixdlt.com/article/start-here-radix-tokens-and-tokenomics) — Radix Knowledge Base - Network Staking Dashboard (https://dashboard.radixdlt.com/network-staking) — Radix Dashboard - Staking & Validating (https://www.radixdlt.com/blog/staking-validating-what-you-need-to-know) — Radix Blog ## Substate Model URL: https://radix.wiki/contents/tech/core-concepts/substate-model Updated: 2026-02-18 Summary: Radix's state architecture where all ledger state is decomposed into discrete, typed records called substates, enabling parallelised consensus across shards. 
Substate Model Category State storage architecture Enforced by Radix Engine (/contents/tech/core-protocols/radix-engine) FSM Comparison Hybrid of UTXO + Account models Consensus Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) (per-shard) Reference Radix Blog (https://www.radixdlt.com/blog/how-radix-engine-is-designed-to-scale-dapps) Introduction The Substate Model is Radix's state storage architecture where all ledger state is decomposed into discrete, typed records called substates. Each substate is subject to specific rules enforced by the Radix Engine (/contents/tech/core-protocols/radix-engine) 's finite state machine (/contents/tech/core-concepts/finite-state-machines) constraint model. This architecture combines properties of both the UTXO (/contents/tech/core-concepts/unspent-transaction-output-utxo-model) and account models while enabling parallelised consensus. Structure The Radix Engine state model is structured as a forest of state sub-trees. Each sub-tree consists of entities (identified by unique addresses) that have zero or more substates at keyed positions. Substates are typed — for example, a token balance substate tracks ownership and enforces conservation (if tokens leave Alice, they must arrive at Bob). Entities can own other entities, forming ownership trees. A Component entity might own multiple Vault entities, each containing resource substates. This hierarchical structure maps naturally to shards (/contents/tech/core-concepts/sharding) — each Component's state can reside on a dedicated shard with full throughput. Comparison to Other Models vs. UTXO (Bitcoin, Cardano) Like Bitcoin's UTXO model, Radix transactions specify substates as inputs. However, unlike pure UTXO where transactions must reference specific UTXOs (causing contention when multiple transactions target the same UTXO), Radix transactions express intent rather than specific substates. 
The Radix Engine resolves which substates satisfy the intent at execution time. This is significant: on Cardano, the first DEX, SundaeSwap (https://en.wikipedia.org/wiki/SundaeSwap), experienced severe contention because multiple users tried to consume the same UTXOs. Radix's intent-based approach avoids this entirely. vs. Account Model (Ethereum) Unlike Ethereum where all token state lives inside a single contract's storage, Radix distributes state across substates assignable to different shards. This eliminates single-contract bottlenecks — on Ethereum, every Uniswap trade goes through one contract's state, creating a serialisation point that limits throughput. External Links - How Radix Engine is Designed to Scale dApps (https://www.radixdlt.com/blog/how-radix-engine-is-designed-to-scale-dapps) — Radix Blog - What is Radix Engine? (https://learn.radixdlt.com/article/what-is-radix-engine) — Radix Knowledge Base ## Byzantine Fault Tolerance URL: https://radix.wiki/contents/tech/core-concepts/byzantine-fault-tolerance Updated: 2026-02-18 Summary: The ability of a distributed system to reach consensus correctly even when fewer than one-third of participants are faulty or malicious, as implemented through Cerberus Byzantine Fault Tolerance Origin Byzantine Generals Problem (https://en.wikipedia.org/wiki/Byzantine_fault) (1982) Threshold Tolerates < 1/3 faulty nodes Radix Implementation Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) Whitepaper arXiv:2008.04450 (https://arxiv.org/pdf/2008.04450) Introduction Byzantine Fault Tolerance (BFT) is the ability of a distributed system to reach consensus correctly even when up to (but fewer than) one-third of participating nodes are faulty or malicious. The term originates from the Byzantine Generals Problem (https://en.wikipedia.org/wiki/Byzantine_fault), a thought experiment by Lamport, Shostak, and Pease (1982) about coordinating action among parties who may include traitors.
In blockchain systems, BFT ensures that the network can agree on the state of the ledger even if some validators are offline, sending conflicting messages, or actively attempting to disrupt consensus. BFT protocols guarantee safety (no incorrect state) as long as fewer than 1/3 of participants are Byzantine. Classical BFT and Its Limitations Classical BFT protocols like PBFT (https://en.wikipedia.org/wiki/Practical_Byzantine_fault_tolerance) (Practical Byzantine Fault Tolerance) run a single consensus pipeline where 2/3+ of validators must agree. While provably secure, this creates a throughput bottleneck — all transactions flow through one pipeline, and the communication complexity grows quadratically with the number of validators (O(n²)). This is the core tension in blockchain design: proven BFT security comes at the cost of limited throughput. Many Layer-1 blockchains "solve" this by reducing the validator count or using probabilistic finality, trading security for performance. Cerberus: Parallelised BFT Radix's Cerberus protocol (/contents/tech/core-protocols/cerberus-consensus-protocol) takes a different approach: it takes a proven single-pipe BFT consensus process and parallelises it across a practically unlimited number of shards (/contents/tech/core-concepts/sharding) . Each shard runs its own local BFT consensus to order transactions on its substates. When a transaction touches state on multiple shards, Cerberus "braids" consensus across those shards using an atomic commitment protocol. This ensures that either all shards commit the transaction or none do — preserving atomic composability (/contents/tech/core-concepts/atomic-composability) across the entire network. This design provides linear scalability: throughput increases proportionally as more nodes and shards are added. 
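The quorum arithmetic and the all-or-none braiding described above can be sketched together. This is a deliberately simplified model, not the Cerberus protocol itself: each shard reaches a local decision via a BFT quorum, and an atomic commitment step combines the per-shard votes.

```python
# Simplified sketch (not actual Cerberus): per-shard BFT quorums plus an
# all-or-none commit across the shards a transaction touches.

def quorum(n):
    """Votes needed among n validators: with n >= 3f + 1 tolerating f faults,
    agreement requires n - f votes (equivalently at least 2f + 1)."""
    f = (n - 1) // 3
    return n - f

def shard_accepts(votes_for, n):
    """Local consensus on one shard: did the proposal gather a quorum?"""
    return votes_for >= quorum(n)

def braided_commit(shard_votes):
    """'commit' only if every involved shard's local consensus accepted."""
    return "commit" if all(shard_votes.values()) else "abort"

assert quorum(4) == 3       # 4 validators tolerate 1 Byzantine fault
assert quorum(100) == 67    # 100 validators tolerate 33

# A transaction touching substates on three shards of 100 validators each:
votes = {"shard_3":  shard_accepts(70, 100),
         "shard_17": shard_accepts(67, 100),
         "shard_42": shard_accepts(60, 100)}   # one shard misses quorum
assert braided_commit(votes) == "abort"        # all shards revert together
```

The key property mirrored here is that a partial commit is impossible: one shard falling short of its quorum aborts the transaction on every shard, which is what preserves atomic composability across the network.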
The Cerberus whitepaper (https://arxiv.org/pdf/2008.04450) demonstrated that the protocol can handle millions of transactions per second in simulation, outperforming other sharded consensus approaches while maintaining the full security guarantees of classical BFT. External Links - Byzantine Fault (https://en.wikipedia.org/wiki/Byzantine_fault) — Wikipedia - What is Cerberus? (https://learn.radixdlt.com/article/what-is-cerberus) — Radix Knowledge Base - Cerberus Whitepaper (https://arxiv.org/pdf/2008.04450) — arXiv ## Component Royalties URL: https://radix.wiki/contents/tech/core-concepts/component-royalties Updated: 2026-02-18 Summary: Protocol-native mechanism allowing developers to earn recurring XRD revenue every time a transaction calls methods on their deployed Blueprints or Components. Component Royalties Type Developer incentive mechanism Denominations XRD or approximate USD Scope Per-function / per-method Mutability Lockable or updatable Documentation Radix Docs (https://docs.radixdlt.com/docs/using-royalties) Introduction Component Royalties are an on-ledger, protocol-native mechanism that allows developers to earn recurring XRD (/contents/tech/core-protocols/xrd-token) revenue every time a transaction interacts with their deployed Blueprints or Components (/contents/tech/core-concepts/blueprints-and-packages) . Unlike grant programs or one-time funding, royalties scale with ecosystem usage and are enforced by the protocol itself. This creates a sustainable developer incentive system (https://learn.radixdlt.com/article/what-is-the-radix-developer-royalties-system) where building widely-used infrastructure is directly rewarded — every piece of code that contributes to a transaction can earn its share. How Royalties Work Developers set royalty amounts on individual functions (Blueprint-level) and methods (Component-level). Royalties are charged as part of the transaction fee whenever that function or method is called. 
Royalties can be denominated in: - XRD — a fixed amount per call (e.g., 1 XRD) - Approximate USD — a dollar-equivalent amount (e.g., $0.05 per call), where the protocol uses a constant multiplier to convert USD to XRD at execution time - Free — no royalty charged A single DeFi transaction that involves multiple Components — say a DEX, a lending pool, and an oracle — pays royalties to each Component's developer. Revenue grows linearly with transaction volume. Mutability & Trust Royalties can be marked as locked (immutable forever) or updatable (the developer can change or later lock them). Locked royalties provide certainty to users and integrators that fees will not increase. Updatable royalties may discourage usage, since the developer could raise fees unpredictably. This opt-in model means developers must balance revenue aspirations against user trust — locking royalties at launch signals commitment, while keeping them updatable provides flexibility. External Links - Using Royalties (/developers/legacy-docs/build/scrypto-1/royalties/using-royalties) — Radix Documentation - Developer Royalties System (https://learn.radixdlt.com/article/what-is-the-radix-developer-royalties-system) — Radix Knowledge Base - On-Ledger Recurring Developer Revenue (https://radixdlt.medium.com/on-ledger-recurring-developer-revenue-incentives-to-buidl-bfc0ba03dd1b) — Radix Medium ## Network Emissions URL: https://radix.wiki/contents/tech/core-concepts/network-emissions Updated: 2026-02-18 Summary: Protocol-level minting of new XRD distributed as staking rewards to validators and delegators, constituting Radix's inflation mechanism and security incentive. 
Network Emissions Total Supply Cap 24 billion XRD Genesis Allocation 12 billion (July 2021) Emissions Pool ~12 billion over ~40 years Annual Rate ~300 million XRD/year Fee Burn Rate 50% of base transaction fees Reference Radix Tokenomics (https://learn.radixdlt.com/article/start-here-radix-tokens-and-tokenomics) Introduction Network Emissions are the protocol-level minting of new $XRD (/contents/tech/core-protocols/xrd-token) tokens distributed as staking (/contents/tech/core-concepts/staking) rewards to validators (/contents/tech/core-concepts/validator-nodes) and their delegators. This mechanism provides the primary economic incentive for securing the Radix network through Delegated Proof of Stake (/contents/tech/core-concepts/delegated-proof-of-stake-dpos) . Supply & Distribution XRD has a hard cap of 24 billion tokens. At genesis in July 2021, 12 billion were allocated: 9.6 billion unlocked and circulating, and 2.4 billion locked indefinitely in a stable coin reserve. The remaining 12 billion are minted as emissions over approximately 40 years at a rate of roughly 300 million XRD per year. Emissions for each epoch are allocated to active validators proportional to their share of total staked XRD. Validators take their configured fee percentage, and the remainder flows to delegators proportionally. The protocol auto-compounds staker rewards by re-staking emission XRD, increasing the value of existing Liquid Stake Units (/contents/tech/core-concepts/liquid-stake-units) . Eligibility - Only the top 100 validators in the Active Validator Set receive emissions - Validators with uptime below 98% receive zero emissions for that epoch - This creates strong incentives for high availability and distributed staking Fee Burning 50% of base network transaction fees are burned (permanently destroyed), creating deflationary pressure that partially offsets emissions inflation. 
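The interaction of emissions and fee burning can be put in back-of-envelope terms using the figures above (~300 million XRD/year emissions, 50% of base fees burned). The annual fee volumes below are hypothetical inputs, not projections.

```python
# Back-of-envelope sketch of net XRD supply change per year, using the
# stated ~300M XRD/yr emissions and 50% base-fee burn. Fee volumes are
# hypothetical illustration values.

EMISSIONS_PER_YEAR = 300_000_000   # XRD minted as staking rewards
BURN_FRACTION = 0.5                # share of base fees permanently destroyed

def net_supply_change(annual_base_fees_xrd):
    """Positive = net inflation; negative = net deflation (XRD/year)."""
    return EMISSIONS_PER_YEAR - BURN_FRACTION * annual_base_fees_xrd

# Modest usage: emissions dominate and supply grows.
assert net_supply_change(100_000_000) == 250_000_000
# Break-even: 600M XRD/year in base fees fully offsets emissions.
assert net_supply_change(600_000_000) == 0
# Beyond that, XRD turns net deflationary.
assert net_supply_change(1_000_000_000) == -200_000_000
```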
As network usage grows, the burn rate increases — potentially reducing net inflation or even making XRD deflationary at scale. This dual mechanism — inflationary emissions for security, deflationary fee burning for value — aims to balance network security incentives with long-term token value. External Links - Tokens and Tokenomics (https://learn.radixdlt.com/article/start-here-radix-tokens-and-tokenomics) — Radix Knowledge Base - Staking & Validating (https://www.radixdlt.com/blog/staking-validating-what-you-need-to-know) — Radix Blog ## Validator Nodes URL: https://radix.wiki/contents/tech/core-concepts/validator-nodes Updated: 2026-02-18 Summary: Network participants running Cerberus consensus, selected via Delegated Proof of Stake where the top 100 validators by staked XRD form the Active Validator Set. Validator Nodes Consensus Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) Selection Delegated Proof of Stake (/contents/tech/core-concepts/delegated-proof-of-stake-dpos) Active Set Size 100 validators Epoch Length 10,000 rounds Uptime Threshold 98% Dashboard Radix Dashboard (https://dashboard.radixdlt.com/network-staking) Introduction Validator Nodes are network participants that run the Cerberus consensus protocol (/contents/tech/core-protocols/cerberus-consensus-protocol) to propose and validate transactions on the Radix network. They are selected through a Delegated Proof of Stake (DPoS) (/contents/tech/core-concepts/delegated-proof-of-stake-dpos) mechanism where XRD holders delegate their tokens to validators they trust. The top 100 validators by total delegated stake form the Active Validator Set at the start of each epoch. Only active validators participate in consensus and receive network emissions (/contents/tech/core-concepts/network-emissions) . Consensus & Epochs Consensus proceeds in epochs, each consisting of 10,000 rounds. In each round, one validator is the "leader" responsible for proposing consensus. 
Validators take turns as leader with probability proportional to their stake. If a validator is offline during its leader rounds, those rounds fail — blocking transaction commitment until the next healthy validator's round begins. Validators must maintain at least 98% uptime; below this threshold, the protocol provides zero emissions rewards to that validator and its stakers for the epoch. Revenue Sources - A share of network emissions (/contents/tech/core-concepts/network-emissions) proportional to stake - A configurable validator fee (percentage of staker emissions) - A portion of transaction fees and tips Registration & Operation Anyone can register as a validator by running a full node (https://docs.radixdlt.com/docs/node-registering-as-a-validator) and submitting a registration transaction. Validators are on-ledger Components with configurable metadata (name, URL, icon), a fee percentage, and optional staker acceptance rules. A validator enters the Active Set only if it accumulates enough delegated stake to rank in the top 100. Validators below this threshold remain registered but do not participate in consensus or earn rewards. When XRD holders stake (/contents/tech/core-concepts/staking) to a validator, they receive Liquid Stake Units (LSUs) (/contents/tech/core-concepts/liquid-stake-units) — fungible tokens representing their staked position — which can be traded, used as collateral in DeFi, or unstaked (subject to a delay period). External Links - Validator (/developers/legacy-docs/reference/radix-engine/native-blueprints/validator) — Radix Documentation - How Should I Choose Validators? 
(https://learn.radixdlt.com/article/how-should-i-choose-validators-to-stake-to) — Radix Knowledge Base - Network Staking Dashboard (https://dashboard.radixdlt.com/network-staking) — Radix Dashboard ## Access Rules & Auth Zones URL: https://radix.wiki/contents/tech/core-concepts/access-rules-and-auth-zones Updated: 2026-02-18 Summary: Radix's declarative authorisation model where Components define what badge Proofs must be present in the caller's Auth Zone for a method call to succeed. Access Rules & Auth Zones Category Authorisation model Mechanism Badge-based Proof verification Scope Per-method, per-Component Documentation Radix Docs (https://docs.radixdlt.com/docs/auth) Introduction Access Rules are declarative authorisation policies attached to Blueprint (/contents/tech/core-concepts/blueprints-and-packages) and Component methods. They define what Proofs (/contents/tech/core-concepts/buckets-proofs-and-vaults) must be present for a call to succeed. Auth Zones are per-call-frame containers that hold Proofs during Transaction Manifest (/contents/tech/core-protocols/transaction-manifests) execution, forming Radix's native authorisation system. This model replaces Ethereum's msg.sender pattern with a more flexible, composable approach. Instead of checking "who called this function", Radix checks "what Proofs are available" — enabling multi-factor authentication, role-based access, and delegated authority natively. Badges A badge is any resource — fungible token or NFT — that a Component's Access Rules reference for authorisation. There is nothing structurally special about a badge; it is simply a resource used for access control. For example, an "admin badge" NFT might be required to call a set_price method on a DEX Component. 
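The admin-badge example can be modeled in plain Rust. This is a sketch of the rule-checking concept only, not the Scrypto API: the types, names, and string addresses are hypothetical, and on Radix the engine itself performs this check against the Auth Zone before the method body runs.

```rust
use std::collections::HashSet;

// Resource addresses simplified to strings for illustration.
type ResourceAddress = &'static str;

/// The caller's Auth Zone: the set of resources it holds Proofs of.
struct AuthZone {
    proofs: HashSet<ResourceAddress>,
}

struct Dex {
    admin_badge: ResourceAddress, // resource named by the Access Rule
    price: u64,
}

impl Dex {
    /// Succeeds only if a Proof of the admin badge is in the Auth Zone.
    fn set_price(&mut self, auth: &AuthZone, new_price: u64) -> Result<(), &'static str> {
        if !auth.proofs.contains(self.admin_badge) {
            return Err("access denied: admin badge Proof not present");
        }
        self.price = new_price;
        Ok(())
    }
}
```

A caller whose Auth Zone lacks the badge Proof gets a rejection before any state changes; a caller holding it succeeds.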
Access Rules support composite logic: - require(resource) — must present Proof of a specific resource - require_amount(n, resource) — must prove ownership of at least n units - require_any_of / require_all_of — OR / AND composition of multiple badge requirements - require_n_of(n, resources) — threshold (N of M) badge requirements Auth Zone Mechanics Each call frame in a transaction has its own Auth Zone — a stack of Proofs. When a Transaction Manifest creates a Proof (e.g., from an Account's Vault) and pushes it, subsequent method calls in the same frame can see it. Proof propagation rules provide security guarantees: - Proofs move up the call stack freely — a method can return a Proof to the manifest's Auth Zone - Proofs move down the stack once — if the manifest pushes a Proof and calls Component A, then Component B, both see the Proof - But if Component B internally calls Component C, Proofs from the manifest do not propagate to C's frame — preventing unintended authorisation delegation Access Rules can be configured as locked (immutable) or updatable, and support named roles (e.g., "admin", "minter") that map to different badge requirements. Smart Accounts (/contents/tech/core-protocols/smart-accounts) use a specialised Access Controller Component that implements multi-factor authentication using this same mechanism. External Links - Proofs / Auth (https://docs.radixdlt.com/docs/auth) — Radix Documentation - Authorization Model (/developers/legacy-docs/reference/radix-engine/authorization-model) — Radix Documentation - Multi-Factor Smart Accounts (https://www.radixdlt.com/blog/how-radix-multi-factor-smart-accounts-work-and-what-they-can-do) — Radix Blog ## Buckets, Proofs & Vaults URL: https://radix.wiki/contents/tech/core-concepts/buckets-proofs-and-vaults Updated: 2026-02-18 Summary: The three core asset containers in Scrypto: Vaults for permanent storage, Buckets for in-transaction movement, and Proofs for authorization attestation. 
Buckets, Proofs & Vaults Category Asset container primitives Paradigm Asset-Oriented Programming (/contents/tech/core-concepts/asset-oriented-programming) Runtime Radix Engine (/contents/tech/core-protocols/radix-engine) Documentation Radix Docs (https://docs.radixdlt.com/docs/buckets-and-vaults) Introduction Buckets, Vaults, and Proofs are the three core asset containers in Scrypto (/contents/tech/core-protocols/scrypto-programming-language) , each serving a distinct purpose in Asset-Oriented Programming (/contents/tech/core-concepts/asset-oriented-programming) . Together they form the mechanism by which resources are stored, moved, and used for authorisation on the Radix network. Vaults A Vault is a permanent, on-ledger container that holds a single type of resource (fungible tokens or non-fungible tokens (/contents/tech/core-protocols/nfts-on-radix) ). Vaults are owned by Components — including Account components (/contents/tech/core-protocols/smart-accounts) — and represent the resting state of all assets on the network. At the end of every transaction, all resources must reside in Vaults. Any resources left outside a Vault cause the transaction to fail, preventing accidental token loss. A Component may hold multiple Vaults, but each Vault holds only one resource type. Buckets A Bucket is a temporary, in-transaction container for moving resources between Components. Buckets exist only during transaction execution — they are created when resources are withdrawn from a Vault and destroyed when deposited into another Vault. The lifecycle of a Bucket within a Transaction Manifest (/contents/tech/core-protocols/transaction-manifests) : - Withdraw resources from an Account Vault → creates a Bucket - Pass the Bucket to a Component method → transfers ownership - The Component deposits the Bucket into its own Vault The Radix Engine enforces that all Buckets must be empty or deposited by the end of the transaction. 
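Rust's move semantics make the Bucket lifecycle above easy to sketch. The types below are hypothetical stand-ins, not the scrypto crate: withdrawing produces a Bucket by value, and depositing consumes it, so a Bucket cannot be spent twice.

```rust
struct Bucket {
    amount: u64,
}

struct Vault {
    amount: u64,
}

impl Vault {
    fn new(amount: u64) -> Self {
        Vault { amount }
    }
    /// Withdrawing moves resources out of the Vault into a fresh Bucket.
    fn take(&mut self, amount: u64) -> Bucket {
        assert!(amount <= self.amount, "insufficient balance");
        self.amount -= amount;
        Bucket { amount }
    }
    /// Depositing takes the Bucket by value and consumes it; the compiler
    /// rejects any further use of the moved Bucket.
    fn put(&mut self, bucket: Bucket) {
        self.amount += bucket.amount;
    }
}
```

Moving 10 units from one Vault to another preserves the total, mirroring the engine's rule that every Bucket ends the transaction deposited in a Vault.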
This guarantee — enforced at the platform level, not by application code — makes it impossible to accidentally lose tokens by sending them to the wrong address. Proofs A Proof is a non-transferable cryptographic attestation created from a Vault or Bucket, proving that the caller possesses a certain quantity or type of resource. Proofs do not move the underlying assets — they attest to their existence. Proofs are placed onto an Auth Zone (/contents/tech/core-concepts/access-rules-and-auth-zones) during transaction execution, where they are checked against Component Access Rules (/contents/tech/core-concepts/access-rules-and-auth-zones) . This is how authorisation works on Radix: rather than checking msg.sender as on Ethereum, a Scrypto method requires a Proof of a specific badge resource. Virtual Proofs are automatically created by the Radix Engine from transaction signatures. When a user signs a transaction with their wallet key, the engine creates a virtual Proof of the account's signature badge, linking traditional cryptographic signing to Radix's badge-based authorisation. External Links - Buckets and Vaults (/developers/legacy-docs/build/scrypto-1/resources/buckets-and-vaults) — Radix Documentation - Resources (/developers/legacy-docs/build/scrypto-1/resources/resources) — Radix Documentation - Proofs and Auth (https://docs.radixdlt.com/docs/auth) — Radix Documentation ## Blueprints & Packages URL: https://radix.wiki/contents/tech/core-concepts/blueprints-and-packages Updated: 2026-02-18 Summary: Scrypto's deployment model where reusable Blueprint templates are bundled into Packages, from which on-ledger Component instances are created. 
Blueprints & Packages Concept Smart contract deployment model Analogy Class → Instance (OOP) Compilation Rust/Scrypto → WebAssembly Runtime Radix Engine (/contents/tech/core-protocols/radix-engine) Documentation Radix Docs (https://docs.radixdlt.com/docs/blueprints-and-components) Introduction In the Radix ecosystem, a Blueprint is a reusable template — analogous to a class in object-oriented programming — from which Components (on-ledger smart contract instances) are created. One or more Blueprints are bundled into a Package, which is the deployable unit on the Radix network. This model separates code deployment from instantiation. A developer publishes a Package once, and anyone on the network can then instantiate Components (https://docs.radixdlt.com/docs/blueprints-and-components) from its Blueprints — each with its own on-ledger address, state, and Vaults (/contents/tech/core-concepts/buckets-proofs-and-vaults) holding resources. Architecture Blueprints A Blueprint defines: - State fields — typed data the Component stores (e.g., a Vault for collected fees, a u64 counter) - Functions — stateless methods callable without a Component instance (including constructors like instantiate_*) - Methods — stateful methods that operate on a specific Component instance - Events — typed events emitted during execution for off-chain indexing Blueprints can also declare access rules (/contents/tech/core-concepts/access-rules-and-auth-zones) , royalties (/contents/tech/core-concepts/component-royalties) , and metadata at both the Blueprint and Component level. Packages Scrypto (/contents/tech/core-protocols/scrypto-programming-language) source code is compiled into WebAssembly (https://webassembly.org/) and bundled into a Package. Once published to the ledger, the Package receives a unique on-ledger address (e.g., package_rdx1...). All Blueprints within the Package share the same address namespace. 
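The class-to-instance analogy can be made concrete in plain Rust. GumballMachine below is a hypothetical example in the spirit of the standard Scrypto tutorials, not a real Package: the type and its impl block play the Blueprint's role (published once), while each constructed value plays a Component's role, with its own state.

```rust
struct GumballMachine {
    price: u64,    // state field, distinct per instance ("Component")
    gumballs: u64,
}

impl GumballMachine {
    /// "Function": callable without an instance; the constructor.
    fn instantiate(price: u64, gumballs: u64) -> Self {
        GumballMachine { price, gumballs }
    }
    /// "Method": operates on one specific instance's state.
    fn buy_gumball(&mut self, payment: u64) -> bool {
        if payment >= self.price && self.gumballs > 0 {
            self.gumballs -= 1;
            true
        } else {
            false
        }
    }
}
```

Two instantiations from the same "Blueprint" hold independent state, just as two Components instantiated from one Package each get their own address and Vaults.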
The Radix network includes several native Packages — system-provided Blueprints for Accounts, Validators, Access Controllers, and Resources — that form the platform's built-in functionality. Component Lifecycle - Develop — write Scrypto Blueprints using Rust tooling - Test — use the Scrypto test framework (https://docs.radixdlt.com/docs/getting-rust-scrypto) with the ledger simulator - Publish — deploy the compiled Package to the Radix ledger via a Transaction Manifest (/contents/tech/core-protocols/transaction-manifests) - Instantiate — call a Blueprint's constructor function to create a Component - Interact — call the Component's methods via Transaction Manifests External Links - Blueprints and Components (/developers/legacy-docs/build/scrypto-1/blueprints-and-components) — Radix Documentation - Your First Scrypto Project (https://docs.radixdlt.com/docs/learning-to-explain-your-first-scrypto-project) — Radix Documentation - radixdlt-scrypto (https://github.com/radixdlt/radixdlt-scrypto) — GitHub Repository ## Asset-Oriented Programming URL: https://radix.wiki/contents/tech/core-concepts/asset-oriented-programming Updated: 2026-02-18 Summary: Programming paradigm where digital assets are first-class platform primitives that behave like physical objects, eliminating entire classes of smart contract vulnerabilities. Asset-Oriented Programming Paradigm Resource-centric smart contracts Introduced Radix Babylon (2023) Language Scrypto (/contents/tech/core-protocols/scrypto-programming-language) Runtime Radix Engine (/contents/tech/core-protocols/radix-engine) Specification Radix Blog (https://www.radixdlt.com/blog/scrypto-an-asset-oriented-smart-contract-language) Introduction Asset-Oriented Programming is the foundational programming paradigm of the Radix platform (/contents/tech/core-protocols/radix-engine) , in which digital assets — tokens, NFTs, badges — are treated as native, first-class primitives governed by the execution environment rather than by application code.
Unlike the Ethereum account model (https://ethereum.org/en/developers/docs/accounts/) where a token is merely a number in a mapping inside a smart contract (balances[address] = 100), on Radix resources are physical-like objects that can never be copied, accidentally destroyed, or lost. This design is enforced by the Radix Engine (/contents/tech/core-protocols/radix-engine) 's finite state machine (/contents/tech/core-concepts/finite-state-machines) constraint model. When a Scrypto (/contents/tech/core-protocols/scrypto-programming-language) component method accepts a Bucket (/contents/tech/core-concepts/buckets-proofs-and-vaults) of tokens, ownership of those tokens physically transfers to the component at the engine level — not by updating a balance entry, but by moving a container. How It Differs from Ethereum On Ethereum, transferring ERC-20 tokens involves calling the token contract's transfer function, which decrements the sender's balance and increments the receiver's. This creates several vulnerability classes: - Reentrancy attacks — an external call can re-enter the contract before state updates complete (the cause of the 2016 DAO hack (https://en.wikipedia.org/wiki/The_DAO) ) - Approval exploits — the approve/transferFrom pattern introduces a second attack surface - Accidental token loss — tokens sent to a contract with no withdrawal function are permanently locked Asset-Oriented Programming eliminates these by design. Reentrancy is impossible because resources physically move rather than balances being updated. There is no approval pattern — you pass a Bucket directly. And the Radix Engine enforces that every Bucket must be deposited into a Vault (/contents/tech/core-concepts/buckets-proofs-and-vaults) by the end of a transaction, or the transaction fails. 
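A toy model of that platform-level bookkeeping, with hypothetical types rather than the scrypto API: the engine tracks what leaves Vaults and what returns to them, and a transaction may only commit when the books balance, regardless of what the application code did in between.

```rust
/// Per-resource bookkeeping for one transaction (illustrative only).
struct TxTracker {
    withdrawn: u64, // units taken out of Vaults during the transaction
    deposited: u64, // units returned to Vaults by the end
}

impl TxTracker {
    /// The platform-level rule: commit only if nothing is left in flight,
    /// so (absent authorised minting or burning) conservation holds.
    fn can_commit(&self) -> bool {
        self.withdrawn == self.deposited
    }
}
```

A transaction that withdraws 100 units and deposits only 90 is rejected wholesale, which is why application bugs cannot lose or invent tokens.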
Resources as Platform Primitives Resources on Radix are created via built-in system calls (https://docs.radixdlt.com/docs/resources) with configurable behaviours: mintable, burnable, divisibility, metadata, and more. The platform enforces conservation — if 100 tokens enter a transaction, exactly 100 must leave it (unless authorised minting or burning occurs). This guarantee is provided at the engine level, not by application code, meaning that bugs in Scrypto components cannot violate resource conservation. Because resources are native types, the Radix Wallet (/contents/tech/core-protocols/radix-wallet) can display and manage any token or NFT without requiring per-token integration — unlike Ethereum wallets which must manually add each ERC-20 contract address. External Links - Scrypto: An Asset-Oriented Smart Contract Language (https://www.radixdlt.com/blog/scrypto-an-asset-oriented-smart-contract-language) — Radix Blog - Resources (/developers/legacy-docs/build/scrypto-1/resources/resources) — Radix Documentation - radixdlt-scrypto (https://github.com/radixdlt/radixdlt-scrypto) — GitHub Repository ## Scrypto v1.1.0 URL: https://radix.wiki/developers/legacy-docs/updates/release-notes/scrypto/scrypto-v1-1-0 Updated: 2026-02-18 Summary: Scrypto v1.1.0 release is a version that targets the Anemone network, which introduces the following protocol updates: Updates > Release Notes > Scrypto > Scrypto v1.1.0 — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/release-notes/scrypto/scrypto-v1-1-0.md) Summary Scrypto v1.1.0 release is a version that targets the Anemone network, which introduces the following protocol updates: - Second-precision network time - BLS and Keccak256 cryptography - Improved pool blueprints - Reduced validator creation fee Latest Compatible Product Versions By using the binaries, including libraries, CLIs and docker images you agree to the End User License Agreement. 
You can find all terms and conditions here (https://uploads-ssl.webflow.com/6053f7fca5bf627283b582c2/65006e3d3a8002f9e834320c_radixdlt.com_genericEULA.pdf) . Engine, Node and Scrypto: - Scrypto: v1.1.0 (https://github.com/radixdlt/radixdlt-scrypto/releases/tag/v1.1.0) on Github - Radix Engine Toolkit Core: v1.0.5 (https://github.com/radixdlt/radix-engine-toolkit/releases/tag/v1.0.5) on Github - Node: v1.1.0 (https://github.com/radixdlt/babylon-node/releases/tag/v1.1.0) on Github App Building Tools: - Gateway API: ReDocly (https://radix-babylon-gateway-api.redoc.ly/) | Root URL (https://mainnet-gateway.radixdlt.com/) | Swagger (https://mainnet-gateway.radixdlt.com/swagger/index.html) - dApp Toolkit: 1.4.3 (https://www.npmjs.com/package/@radixdlt/radix-dapp-toolkit/v/1.0.0) on npm - TypeScript Radix Engine Toolkit: v1.0.3 (https://www.npmjs.com/package/@radixdlt/radix-engine-toolkit/v/1.0.3) on npm - ROLA Library for Node.js backends: 1.0.3 (https://www.npmjs.com/package/@radixdlt/rola/v/1.0.3) on npm Public Applications: - Wallet: Installation guide (https://wallet.radixdlt.com/) - Connector Extension: v1.3.4 (https://github.com/radixdlt/connector-extension/releases/tag/v1.3.4) on Github - Dashboard: Mainnet (https://dashboard.radixdlt.com/) | Stokenet (/contents/tech/releases/stokenet) - Mainnet Network ID: 1 (0x01) - Stokenet Network ID: 2 (0x02) Changes from v1.0.1 Second-precision Timestamp Second-precision network timestamps are now available for Scrypto. See the rustdocs (https://docs.rs/scrypto/latest/scrypto/runtime/struct.Clock.html) of Clock for more. BLS12-381 and Keccak256 Cryptography A new set of crypto APIs have been introduced. Using these APIs is much cheaper than doing the same computation within WASM. 
For use guides, please refer to Cryptography (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/cryptography/README.md) . Pool Blueprints Update All native pool blueprints have been updated to use PreciseDecimal for internal computation. This significantly improves the accuracy of maths for all existing and new pools. No interface has been changed. Validator Creation Fee Reduced Not technically a Scrypto-related change, but it's worth mentioning that the validator creation fee will be reduced to 100 USD once Anemone is activated. RESIM Improvements Various enhancements to the resim CLI: - Show both the name and the symbol of a resource if available. - Make address optional for the resim show command - Check public and private keys provided to resim set-default-account - Support argument building for FungibleBucket and NonFungibleBucket - Fix the issue where resim doesn't read the .rpd schema file correctly Scrypto Coverage Report You can now run scrypto coverage to generate a coverage report of your Scrypto project. ## Scrypto v1.2.0 URL: https://radix.wiki/developers/legacy-docs/updates/release-notes/scrypto/scrypto-v1-2-0 Updated: 2026-02-18 Summary: Scrypto v1.2.0 introduces various features and enhancements, targeting the Bottlenose protocol update. Updates > Release Notes > Scrypto > Scrypto v1.2.0 — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/release-notes/scrypto/scrypto-v1-2-0.md) Scrypto v1.2.0 introduces various features and enhancements, targeting the Bottlenose protocol update. Starting from this version, Scrypto and Radix Engine crates are published to crates.io (https://crates.io/) . New Bottlenose Features The Bottlenose protocol update introduces two new features that can be used by Scrypto blueprints. There is a new native blueprint, AccountLocker, which makes sending resources to accounts easier.
See this doc (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/locker.md) for more details. A corresponding stub is added to the Scrypto library. A new method get_owner_role is added to every global component. Here is an example of how to get the owner role of a component: let owner_role = global_component.get_owner_role(); Radix CLIs The simulator is rebranded to radix-clis, featuring the following tools: - resim - A ledger simulator to play with Radix Engine features - scrypto - A tool for creating, building and testing Scrypto code - rtmc/rtmd - Radix transaction manifest compiler and decompiler - scrypto-bindgen - A tool for generating stubs for the blueprints of a package Here is the new way of installing Radix CLIs: cargo install --force radix-clis@1.2.0 Pretty Manifest Compilation Errors Radix CLI rtmc now prints pretty error messages and highlights the problematic parts. New Scrypto Compiler Library A new library scrypto-compiler is introduced to standardize everything around Scrypto compilation. This crate is used by Radix CLIs, scrypto-builder and scrypto-test, to provide consistent behavior. It is now available for public use. Scrypto Testing Various enhancements are made to Scrypto testing libraries.
- scrypto-unit is merged into scrypto-test - TestRunner is renamed to LedgerSimulator - TestEnvironmentBuilder is introduced for configuring the environment prior to creation - SubstateDatabaseOverlay is added to enable testing and simulation on top of a real Node database - Package is renamed to PackageFactory - Scrypto tests can now specify a CompilerProfile, indicating whether standard or fast Scrypto compilation should be used Manifest Builder Two bug fixes for the ManifestBuilder utility: - Fixed the issue that deposit_batch consumes named buckets, #1702 (https://github.com/radixdlt/radixdlt-scrypto/pull/1702) - Fixed the issue that prevents creation of RUID NFTs, #1701 (https://github.com/radixdlt/radixdlt-scrypto/pull/1701) ## Scrypto v1.3.0 URL: https://radix.wiki/developers/legacy-docs/updates/release-notes/scrypto/scrypto-v1-3-0 Updated: 2026-02-18 Summary: Scrypto v1.3.0 adds support for the cuttlefish protocol update. Updates > Release Notes > Scrypto > Scrypto v1.3.0 — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/release-notes/scrypto/scrypto-v1-3-0.md) Scrypto v1.3.0 adds support for the cuttlefish (https://github.com/gguuttss/radix-docs/blob/master/updates/protocol-updates/cuttlefish.md) protocol update. Use the Correct Rust Version Scrypto v1.3.0 should be used with rustc 1.81 or below, which can be installed with rustup. Using 1.82.0 or higher will not work with the Cuttlefish engine, because newer compilers build WASM with new WebAssembly extensions that are not yet supported by the Radix execution environment. See Scrypto v1.3.0 on docs.rs (https://docs.rs/scrypto/1.3.0) for full technical documentation. New Features Transaction V2 and Subintents The headline feature of cuttlefish is subintents (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/subintents.md) , which function like mini transactions that can be embedded within other transactions.
They are complete user intents which are signed separately and can be passed around off-ledger to be assembled with other intents into a complete transaction. - To understand how subintents work, read about the intent structure (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/intent-structure.md) . - To use subintents in your dApp, read about the Pre-authorization Flow (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-transactions/pre-authorizations-and-subintents.md) which also covers some example use-cases. Further documentation on working with subintents will follow in the next week. But to start you off, here's an example of building a NotarizedTransactionV2 with a subintent:

```rust
let subintent = TransactionBuilder::new_partial_v2()
    .intent_header(IntentHeaderV2 {
        network_id,
        start_epoch_inclusive,
        end_epoch_exclusive,
        intent_discriminator,
        min_proposer_timestamp_inclusive,
        max_proposer_timestamp_exclusive,
    })
    .manifest_builder(|builder| {
        builder
            .withdraw_from_account(account, XRD, 10)
            .take_all_from_worktop(XRD, "xrd")
            .yield_to_parent_with_name_lookup(|lookup| (lookup.bucket("xrd"),))
    })
    .sign(&signer_key)
    .build();

let DetailedNotarizedTransactionV2 {
    transaction,
    raw,
    object_names,
    transaction_hashes,
} = TransactionBuilder::new_v2()
    .transaction_header(TransactionHeaderV2 {
        notary_public_key,
        notary_is_signatory,
        tip_basis_points,
    })
    .intent_header(IntentHeaderV2 {
        network_id,
        start_epoch_inclusive,
        end_epoch_exclusive,
        intent_discriminator,
        min_proposer_timestamp_inclusive,
        max_proposer_timestamp_exclusive,
    })
    .add_signed_child("child", subintent)
    .manifest_builder(|builder| {
        builder
            .yield_to_child("child", ())
            .deposit_entire_worktop(account)
    })
    .sign(&signer_key)
    .notarize(&notary_key)
    .build();
```

To build just a TransactionManifestV2, you can use the new_v2() method on ManifestBuilder, or to build a SubintentManifestV2, you can use the new_subintent_v2() method:

```rust
let transaction_manifest = ManifestBuilder::new_v2()
    .use_child("child", subintent_hash)
    .lock_standard_test_fee(account)
    .yield_to_child("child", ())
    .build();

let subintent_manifest = ManifestBuilder::new_subintent_v2()
    .yield_to_parent(())
    .build();
```

Subintent and Transaction building considerations - Subintents can only commit successfully once, but can be included in failing transactions many times before that (this prevents someone maliciously failing your subintent permanently). To prevent fee drainage, fees cannot be locked inside a subintent (contingent fees, however, can be locked). Therefore the root transaction intent must pay the transaction's fees. - Each transaction has one "transaction intent" and zero or more "non-root subintents" - Together, these are called the "intents" of a transaction. - The intents of a transaction form a conceptual tree, with the transaction intent at the root, and each intent can have zero or more subintent children. - All subintents in the transaction must be reachable from the root. - In the type model, the non-root subintents are stored in a normalized / flattened array. A subintent can therefore be addressed by its SubintentHash and, in the context of a given transaction, by its SubintentIndex, which is its 0-based index into this array. - Subintents must YIELD_TO_PARENT exactly as many times as their parent intent targets them with a YIELD_TO_CHILD (see below). - Subintents must end with a YIELD_TO_PARENT instruction.
- Structural limits: - A transaction can only have an intent depth of 4 (a transaction intent root, and three additional levels of subintents) - A transaction can have a maximum of 32 total subintents, and 64 signatures across these subintents - Each manifest can have a maximum of 1000 instructions V2 Preview The TransactionBuilder::new_v2() can also be used to build a PreviewTransactionV2, with the build_preview_transaction(transaction_intent_signer_public_keys) method, which can be used before signing/notarizing. The PreviewTransactionV2 can be used in the new preview-v2 API on the Core API (https://radix-babylon-core-api.redoc.ly/#tag/Transaction/paths/~1transaction~1preview-v2/post) or Gateway API (https://radix-babylon-gateway-api.redoc.ly/#operation/TransactionPreviewV2) . New Manifest Instructions Notes These instructions are only available in V2 manifests, which are supported only by SubintentManifestV2 and TransactionManifestV2. For the time being, the Radix Wallet will only support: - Childless SubintentManifestV2 stubs for pre-authorization requests - TransactionManifestV1 stubs for transaction requests Therefore the new instructions cannot currently be used when making transaction requests to the wallet. A brief overview of the new V2 instructions is as follows. Full detail and examples are given in the manifest instructions article (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/manifest/manifest-instructions.md) . Examples of subintent and transaction manifests can be found here (https://github.com/radixdlt/radixdlt-scrypto/tree/main/radix-transaction-scenarios/generated-examples/cuttlefish/basic_subintents/manifests) . - USE_CHILD — A pseudo-instruction that must be at the start of a written manifest. It declares the children of the intent, by specifying their subintent hash and giving them a name to be used in the rest of the manifest. - VERIFY_PARENT — Verifies that the parent intent's auth zone satisfies the given access rule. This can be used for a counterparty check. For example, it can be used to require that the subintent is used by a particular dApp. - YIELD_TO_PARENT — Yields execution to the parent intent. This instruction takes any number of arguments, which can be used to pass buckets; it can also receive buckets onto the worktop. - YIELD_TO_CHILD — Yields execution to a child intent. This instruction takes any number of arguments, which can be used to pass buckets; it can also receive buckets onto the worktop. - ASSERT_WORKTOP_IS_EMPTY — Asserts that the worktop is empty. - ASSERT_WORKTOP_RESOURCES_INCLUDE — Asserts that the worktop contains the specified resources, with the given constraints for each resource. Other resources may also be present. - ASSERT_WORKTOP_RESOURCES_ONLY — Asserts that the worktop contains ONLY the specified resources, with the given constraints for each resource. - ASSERT_NEXT_CALL_RETURNS_INCLUDE — Asserts that the next call returns the specified resources, with the given constraints for each resource. It may also return other resources. A call is any method or function call instruction, or a YIELD_... instruction. - ASSERT_NEXT_CALL_RETURNS_ONLY — Asserts that the next call returns ONLY the specified resources, with the given constraints for each resource. - ASSERT_BUCKET_CONTENTS — Asserts that the bucket's contents meet the given constraints.
More Crypto Utils

More cryptographic primitive utilities have been added to the Radix Engine, available in CryptoUtils (https://docs.rs/scrypto/1.3.0/scrypto/crypto_utils/struct.CryptoUtils.html) in Scrypto:

```rust
CryptoUtils::blake2b_256_hash(&data);
CryptoUtils::ed25519_verify(&message, &pub_key, &signature);
CryptoUtils::secp256k1_ecdsa_verify(&hash, &pub_key, &signature);
CryptoUtils::secp256k1_ecdsa_verify_and_key_recover(&hash, &signature);
CryptoUtils::secp256k1_ecdsa_verify_and_key_recover_uncompressed(&hash, &signature);
```

Getter Methods on Account Blueprint

New getter methods have been added to the Account (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/account.md) blueprint, allowing on-chain account balance lookup. The Account (https://docs.rs/scrypto/1.3.0/scrypto/component/struct.Account.html) stub has been updated with these new methods for easy access:

```rust
fn balance(&self, resource_address: ResourceAddress) -> Decimal;
fn non_fungible_local_ids(
    &self,
    resource_address: ResourceAddress,
    limit: u32,
) -> Vec<NonFungibleLocalId>;
fn has_non_fungible(
    &self,
    resource_address: ResourceAddress,
    local_id: NonFungibleLocalId,
) -> bool;
```

Code Changes

Some tweaks have been made to the various scrypto and engine libraries, which may need to be reflected in your code. Notable changes include the following. If you are hitting other compilation errors and need advice, please ask in the dev-lounge channel on Discord (https://discord.com/channels/417762285172555786/803425066678222870) or the Developer Telegram Channel (https://t.me/RadixDevelopers), and we can advise or update this article with more details.

Renames to AccessRule models

- AccessRuleNode has been renamed to CompositeRequirement
- ProofRule has been renamed to BasicRequirement

We have updated this across the stack, although some APIs (e.g. the Core API and SBOR annotated programmatic JSON) may still refer to the old names in some places due to backwards-compatibility requirements.
And the old names are still accepted as Enum descriptors in Manifest Value syntax. See advanced access rules (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/advanced-accessrules.md) for more information. Renames to "virtual" entity types and addresses The old naming caused confusion, and these are now known as "pre-allocated" addresses. This has caused changes to the EntityType enum and various functions on entity type and ComponentAddress. As an example, a "virtual account" can now be called an "uninstantiated pre-allocated account" if it's yet to be instantiated on ledger, or an "instantiated pre-allocated account" once created. We have updated this across the stack, although some APIs (e.g. Core API and SBOR annotated programmatic JSON) may still refer to "virtual" entity types due to their backwards-compatibility requirements. ## Scrypto v1.3.1 URL: https://radix.wiki/developers/legacy-docs/updates/release-notes/scrypto/scrypto-v1-3-1 Updated: 2026-02-18 Summary: Until now, Scrypto packages had to be compiled with Rust 1.81.0 or older. Newer Rust versions emit WASM binaries that can include non-MVP WebAssembly features, Updates > Release Notes > Scrypto > Scrypto v1.3.1 — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/release-notes/scrypto/scrypto-v1-3-1.md) Support For New Rust Compilers Why This Was Needed Until now, Scrypto packages had to be compiled with Rust 1.81.0 or older. Newer Rust versions emit WASM binaries that can include non-MVP WebAssembly features, most notably instructions from the bulk memory proposal—which the Radix Engine does not permit. A natural idea is to force the compiler to stick to MVP WASM by using -Ctarget-cpu=mvp. Unfortunately, that only applies to your crate’s code. The prebuilt Rust standard library distributed for the wasm32-unknown-unknown target may still include non-MVP instructions, and your package is linked against that standard library. 
The end result is a WASM artifact that can still contain instructions rejected by the Radix Engine, due to the standard library. An extensive write-up on this issue can be found here (https://github.com/radixdlt/radixdlt-scrypto/pull/2079) . With this Scrypto release, you can use Rust versions newer than 1.81.0. We’ve tested these changes with Rust 1.92.0, and in principle any newer compiler version should work as well. What Changed We updated the Scrypto build pipeline so the Rust standard library is rebuilt as part of every Scrypto package build. This gives us full control over which WASM features appear in the final artifact, including those originating from the standard library. The pipeline changes are: - RUSTC_BOOTSTRAP=1 is set when compiling Scrypto packages. This allows rebuilding the Rust standard library using the stable compiler the user has installed (more information here (https://doc.rust-lang.org/beta/unstable-book/compiler-environment-variables/RUSTC_BOOTSTRAP.html) ). - The Scrypto compiler now sets CARGO_ENCODED_RUSTFLAGS, RUSTFLAGS, and CARGO_TARGET_WASM32_UNKNOWN_UNKNOWN_RUSTFLAGS to -Ctarget-cpu=mvp -Ctarget-feature=+mutable-globals,+sign-ext -Zunstable-options -Cpanic=abort. This forces the final WASM artifact to use the MVP feature set, while explicitly permitting mutable-globals and sign-ext. - The Scrypto compiler now sets CFLAGS_wasm32_unknown_unknown to -mcpu=mvp -mmutable-globals -msign-ext. This is especially important for packages that depend on crates containing C code (for example minicov), ensuring clang does not generate WASM instructions we don’t permit. - The Scrypto compiler now instructs Rust to rebuild the standard library by passing -Zbuild-std=std,panic_abort and -Zbuild-std-features=optimize_for_size. What This Means For You You should now be able to use Rust versions above 1.81.0, including the Rust 2024 edition, without hitting publish-time failures due to rejected WASM instructions. 
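Concretely, the pipeline changes listed above amount, roughly, to the build configuration sketched below. This is a non-authoritative fragment assembled from the flag list above, not an exact reproduction of what the ScryptoCompiler does; if you use scrypto build you do not need any of this:

```
# Allow rebuilding the standard library with a stable compiler
export RUSTC_BOOTSTRAP=1

# Force the MVP feature set, allowing only mutable-globals and sign-ext
# (the compiler also sets CARGO_ENCODED_RUSTFLAGS and
# CARGO_TARGET_WASM32_UNKNOWN_UNKNOWN_RUSTFLAGS to the same value)
export RUSTFLAGS="-Ctarget-cpu=mvp -Ctarget-feature=+mutable-globals,+sign-ext -Zunstable-options -Cpanic=abort"

# Keep clang-compiled C dependencies (e.g. minicov) on the same feature set
export CFLAGS_wasm32_unknown_unknown="-mcpu=mvp -mmutable-globals -msign-ext"

# Rebuild the standard library as part of the package build
cargo build --release --target wasm32-unknown-unknown \
    -Zbuild-std=std,panic_abort -Zbuild-std-features=optimize_for_size
```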
The Scrypto codebase currently uses Rust 1.92.0 in CI, toolchain files, and in newly generated packages. One important note: if you use a custom build pipeline (i.e., you do not use scrypto build or the ScryptoCompiler), this is a breaking change for you. Building a valid Scrypto WASM is no longer equivalent to a simple cargo build --release --target wasm32-unknown-unknown. In most cases, you must also ensure the output uses MVP WASM features (plus the small set of features we consider safe), which typically requires rebuilding the Rust standard library for the wasm32-unknown-unknown target as part of the build.

New Manifest SBOR Types

Why This Was Needed

To understand the need for this, we need to take a deep dive into how data flows through the Radix Engine. The following is a rough state machine of the various states that the data can exist in and how various data transformations are performed.

Let's now say that you have the following argument in a manifest, which is equivalent to rule!(require(named_address)): an access rule that requires a named address:

```
Enum<2u8>(
    Enum<0u8>(
        Enum<0u8>(
            Enum<1u8>(
                NamedAddress("some_address")
            )
        )
    )
)
```

When this data flows through the Radix Engine, it takes the following path through the above state machine: the data starts out in the textual transaction manifest representation. The manifest compiler then compiles it into a ManifestValue, and eventually that makes its way to the transaction processor. Not all value types in the Manifest SBOR codec are compatible with the Scrypto SBOR codec. For example, Scrypto SBOR doesn't have a concept of a NamedAddress, only static addresses, but the Manifest SBOR codec does. This means that translating a ManifestValue into a ScryptoValue can't be done without some context.
A NamedAddress requires the context of the transaction to understand which address reservation it's linked to and what static address was allocated as a result of the allocation instruction. The transaction processor tracks the transaction's context and is therefore capable of taking a named address (or other types that require conversion) and converting it into the appropriate ScryptoValue. Once the ManifestValue has been converted into a ScryptoValue by the transaction processor, it can then be SBOR-decoded into its appropriate Rust type, which is an AccessRule in this case. This is the core reason why an AccessRule that contains a NamedAddress can be used in a transaction, and that transaction can be executed successfully, despite the AccessRule models not supporting named addresses.

However, there are certain cases where the data might not take the flow described above. Let's use the manifest's static analyzer as an example, which decodes all of the native blueprint invocations found in the manifest in order to analyze it and determine its static validity. For the same example textual manifest input provided above, the state machine diagram would look like the following: the data would start out in its textual manifest representation and would then be compiled by the manifest compiler into a ManifestValue, which is fed into the static analyzer. The static analyzer would then attempt to decode this ManifestValue into its appropriate Rust type, which is an AccessRule in this case, but fail, since the AccessRule model doesn't support NamedAddresses. The static analyzer would then, incorrectly, flag this manifest as statically invalid, despite the fact that if it were submitted to the ledger it would execute successfully.
The core of this issue is that certain native blueprint invocation types implemented the Manifest SBOR codec but didn't use models that could express the full range of data types expressible in that codec. As a result, decoding fails in the static analyzer but not in the transaction processor. As the description suggests, the impact of this issue isn't felt on-ledger; it is only felt off-ledger, by static analyzers and tools that attempt to process the invocations found in manifests, like the Radix Engine Toolkit or the wallet.

What Changed

We introduced new models that implement the Manifest SBOR codec and changed all of the Manifest SBOR invocation types of the native blueprints to use these models. These models have the following properties:

- They're opaque at the language layer (in Rust): you can't inspect the data they contain.
- They're fully typed in the SBOR layer: they implement the same SBOR schema as the Scrypto SBOR types they mirror.
- They're completely immutable.

The list of new Manifest SBOR types added is:

- ManifestPackageDefinition
- ManifestBasicRequirement
- ManifestCompositeRequirement
- ManifestAccessRule
- ManifestOwnerRole
- ManifestMetadataValue

As we've already mentioned, these types are completely opaque at the language layer. We have therefore implemented the following to make it easier to work with them. We will use AccessRule as an example here, but this applies to all of the types above:

- An AccessRule can be converted into a ManifestAccessRule through the ManifestAccessRule::new function, which takes in an AccessRule, or through Rust's From trait. This is useful when building manifests if the user wants to continue using the old models and do the conversion into the new models at the end.
- A ManifestAccessRule can be converted back into an AccessRule through the ManifestAccessRule::try_into_typed method or through Rust's TryFrom trait. As the name of the method suggests, this is a fallible conversion, which would fail if the contained data can't be converted into the AccessRule type. This is useful when analyzing manifests and wanting to peek into these opaque types.

With these newly added models, we changed a number of the invocation types to use the correct Manifest SBOR models. The changes are as follows:

| Blueprint Name | Struct Name | Field Name | Old Type | New Type |
|---|---|---|---|---|
| AccessController | AccessControllerCreateManifestInput | rule_set | RuleSet | ManifestRuleSet |
| | AccessControllerInitiateRecoveryAsPrimaryManifestInput | rule_set | RuleSet | ManifestRuleSet |
| | AccessControllerInitiateRecoveryAsRecoveryManifestInput | rule_set | RuleSet | ManifestRuleSet |
| | AccessControllerQuickConfirmPrimaryRoleRecoveryProposalManifestInput | rule_set | RuleSet | ManifestRuleSet |
| | AccessControllerQuickConfirmRecoveryRoleRecoveryProposalManifestInput | rule_set | RuleSet | ManifestRuleSet |
| | AccessControllerTimedConfirmRecoveryManifestInput | rule_set | RuleSet | ManifestRuleSet |
| | AccessControllerStopTimedRecoveryManifestInput | rule_set | RuleSet | ManifestRuleSet |
| Account | AccountCreateAdvancedManifestInput | owner_role | OwnerRole | ManifestOwnerRole |
| Identity | IdentityCreateAdvancedManifestInput | owner_role | OwnerRole | ManifestOwnerRole |
| AccountLocker | AccountLockerInstantiateManifestInput | owner_role | OwnerRole | ManifestOwnerRole |
| | | storer_role | AccessRule | ManifestAccessRule |
| | | storer_updater_role | AccessRule | ManifestAccessRule |
| | | recoverer_role | AccessRule | ManifestAccessRule |
| | | recoverer_updater_role | AccessRule | ManifestAccessRule |
| Package | PackagePublishWasmManifestInput | definition | PackageDefinition | ManifestPackageDefinition |
| | PackagePublishWasmAdvancedManifestInput | owner_role | OwnerRole | ManifestOwnerRole |
| | | definition | PackageDefinition | ManifestPackageDefinition |
| | PackagePublishNativeManifestInput | definition | PackageDefinition | ManifestPackageDefinition |
| OneResourcePool | OneResourcePoolInstantiateManifestInput | owner_role | OwnerRole | ManifestOwnerRole |
| | | pool_manager | AccessRule | ManifestAccessRule |
| TwoResourcePool | TwoResourcePoolInstantiateManifestInput | owner_role | OwnerRole | ManifestOwnerRole |
| | | pool_manager | AccessRule | ManifestAccessRule |
| MultiResourcePool | MultiResourcePoolInstantiateManifestInput | owner_role | OwnerRole | ManifestOwnerRole |
| | | pool_manager | AccessRule | ManifestAccessRule |
| FungibleResourceManager | FungibleResourceManagerCreateManifestInput | owner_role | OwnerRole | ManifestOwnerRole |
| | | resource_roles | FungibleResourceRoles | ManifestFungibleResourceRoles |
| | FungibleResourceManagerCreateWithInitialSupplyManifestInput | owner_role | OwnerRole | ManifestOwnerRole |
| | | resource_roles | FungibleResourceRoles | ManifestFungibleResourceRoles |
| NonFungibleResourceManager | NonFungibleResourceManagerCreateManifestInput | owner_role | OwnerRole | ManifestOwnerRole |
| | | resource_roles | NonFungibleResourceRoles | ManifestNonFungibleResourceRoles |
| | NonFungibleResourceManagerCreateWithInitialSupplyManifestInput | owner_role | OwnerRole | ManifestOwnerRole |
| | | resource_roles | NonFungibleResourceRoles | ManifestNonFungibleResourceRoles |
| | NonFungibleResourceManagerCreateRuidWithInitialSupplyManifestInput | owner_role | OwnerRole | ManifestOwnerRole |
| | | resource_roles | NonFungibleResourceRoles | ManifestNonFungibleResourceRoles |
| Metadata | MetadataSetManifestInput | value | MetadataValue | ManifestMetadataValue |
| RoleAssignment | RoleAssignmentCreateManifestInput | owner_role | OwnerRoleEntry | ManifestOwnerRoleEntry |
| | RoleAssignmentSetManifestInput | rule | AccessRule | ManifestAccessRule |
| | RoleAssignmentSetOwnerManifestInput | rule | AccessRule | ManifestAccessRule |

Some of the manifest invocation types for the native blueprints have had their ManifestSBOR trait implementations removed, with only ManifestEncode and ManifestCategorize kept. This only applies to types that can't capture the full range of values that the Manifest SBOR codec allows for.
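The conversion pattern described above (an infallible typed-to-opaque `new`/`From`, paired with a fallible `try_into_typed`/`TryFrom` back) can be sketched in plain Rust with stand-in types. Note these are simplified placeholders, not the real Scrypto models: the payload here is a fake one-byte "encoding" rather than real Manifest SBOR.

```rust
// Stand-in for the typed model.
#[derive(Debug, Clone, PartialEq)]
pub enum AccessRule {
    AllowAll,
    DenyAll,
}

// Stand-in for the opaque Manifest SBOR model: the payload is private,
// so callers cannot inspect the data it contains.
#[derive(Debug, Clone, PartialEq)]
pub struct ManifestAccessRule {
    payload: Vec<u8>,
}

impl ManifestAccessRule {
    // Mirrors `ManifestAccessRule::new`: infallible typed-to-opaque conversion.
    pub fn new(rule: AccessRule) -> Self {
        let byte = match rule {
            AccessRule::AllowAll => 0u8,
            AccessRule::DenyAll => 1u8,
        };
        Self { payload: vec![byte] }
    }

    // Mirrors `try_into_typed`: fallible, because the opaque payload may
    // encode values (e.g. named addresses) the typed model can't represent.
    pub fn try_into_typed(&self) -> Result<AccessRule, String> {
        match self.payload.as_slice() {
            [0] => Ok(AccessRule::AllowAll),
            [1] => Ok(AccessRule::DenyAll),
            _ => Err("payload is not representable as a typed AccessRule".to_string()),
        }
    }
}

impl From<AccessRule> for ManifestAccessRule {
    fn from(rule: AccessRule) -> Self {
        Self::new(rule)
    }
}

impl TryFrom<ManifestAccessRule> for AccessRule {
    type Error = String;
    fn try_from(value: ManifestAccessRule) -> Result<Self, Self::Error> {
        value.try_into_typed()
    }
}
```

The asymmetry is the point of the design: every typed value has an opaque encoding, but not every opaque encoding has a typed value, so only the opaque-to-typed direction can fail.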
What This Means For You

If you use the ManifestBuilder, or if you author your manifests in the textual formats, then these changes do not affect you; you should be able to build your manifests in the same way as before. If you made use of the *ManifestInput types, you will be affected by this change and will notice that some of the fields on your *ManifestInput structs now have different types. This can mostly be solved by calling .into() on these fields since, as mentioned above, these new opaque types allow for conversions through Rust's From trait.

Miscellaneous Changes

- Fixed an issue where packages wouldn't compile due to an issue with CMake.
- Improved the Scrypto logging macros to work better with IDEs.
- Added support for non-fungible resources in the receipt's execution trace.
- Changed the separator used in SBOR's categorize_types attribute to ;.
- Fixed bugs with the TypedManifestNativeInvocation type that caused decoding to fail in cases when it should not.
- Added more native invocations to TypedManifestNativeInvocation to ensure that it supports all of the invocations on the native blueprints.

## Scrypto URL: https://radix.wiki/developers/legacy-docs/updates/release-notes/scrypto/scrypto Updated: 2026-02-18 Summary: Legacy documentation: Scrypto Updates > Release Notes > Scrypto — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/release-notes/scrypto/README.md) ## Release Notes URL: https://radix.wiki/developers/legacy-docs/updates/release-notes/release-notes Updated: 2026-02-18 Summary: Legacy documentation: Release Notes Updates > Release Notes — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/release-notes/README.md) ## Backlog URL: https://radix.wiki/developers/legacy-docs/updates/roadmap/developer-tools/developer-tools-backlog Updated: 2026-02-18 Summary: Items within a particular category are not sorted by priority!
Updates > Roadmap > Developer Tools > Backlog — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/roadmap/developer-tools/developer-tools-backlog.md) - Higher Priority contains items which are most likely to be tackled next - General Items contains items which have been reviewed and are slated to be implemented - Acknowledged contains items which either have not been reviewed, or have a lower priority Items within a particular category are not sorted by priority! General Items - dApp Toolkit: Support for avatar feature of wallet personas - dApp Toolkit: Better abstractions for accounts and resources - dApp Toolkit: Improved caching - Radix Engine Toolkit (RET): Add Long Term Support (LTS) section to UniFFI wrappers with simplified operations for common needs, similar to TypeScript RET LTS - RET: Add C++ and Go wrappers (dependent upon UniFFI generators which are in development by third parties) - RET: Utilities for message encryption - RET: Recognize additional transaction patterns for wallet consumers Acknowledged - RET: Look into options for reducing size of WASM included in TypeScript RET, or alternatives - RET: Unify TypeScript RET interface behavior with other language wrappers ## Underway URL: https://radix.wiki/developers/legacy-docs/updates/roadmap/developer-tools/developer-tools-underway Updated: 2026-02-18 Summary: This page is a high-level summary of what is currently "top-of-mind" in this area for the development team. Updates > Roadmap > Developer Tools > Underway — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/roadmap/developer-tools/developer-tools-underway.md) This page is a high-level summary of what is currently "top-of-mind" in this area for the development team. Many business-as-usual tasks, like testing, fixes, research & design, and small features/optimizations don't rise to the level of being included here. 
Items within sections are not ordered, and no timeline should be assumed for any particular item unless it has been explicitly communicated. - Radix Engine Toolkit (RET): Update documentation for main repo as well as all UniFFI wrappers, including usage examples - RET: Publishing Kotlin & Android packages to Maven Central - RET: Additional manifest builder methods targeted at native blueprints/components ## Developer Tools URL: https://radix.wiki/developers/legacy-docs/updates/roadmap/developer-tools/developer-tools Updated: 2026-02-18 Summary: Legacy documentation: Developer Tools Updates > Roadmap > Developer Tools — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/roadmap/developer-tools/README.md) ## Complete URL: https://radix.wiki/developers/legacy-docs/updates/roadmap/gateway/gateway-complete Updated: 2026-02-18 Summary: Legacy documentation: Complete Updates > Roadmap > Gateway > Complete — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/roadmap/gateway/gateway-complete.md) Post Bottlenose - new features - Bidirectional Metadata Link Validation - Who currently holds most of resource X by amount DESC (current state only) - Native resource details - Exposing iteration over a KeyValueStore and native blueprint partitions at the current state version. - Effective fee change calculation on validator list Bottlenose support - Support for account lockers Rework of transaction results and filtering - More consistent and complete handling of transaction classification and results between Preview and Post-commit. - Improved transaction filtering - eg allow querying transaction/stream by event emitter for all event types. ## Backlog URL: https://radix.wiki/developers/legacy-docs/updates/roadmap/gateway/gateway-backlog Updated: 2026-02-18 Summary: Items within a particular category are not sorted by priority! 
Updates > Roadmap > Gateway > Backlog — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/roadmap/gateway/gateway-backlog.md) - Higher Priority contains items which are most likely to be tackled next - General Items contains items which have been reviewed and are slated to be implemented - Acknowledged contains items which either have not been reviewed, or have a lower priority Items within a particular category are not sorted by priority! Higher Priority - Implicit Requirements (e.g. Global caller) index - to look up a global caller / public key etc from its hash General Items - Improved tracing through the Gateway service to improve debugging The backlog will be expanded once Gateway discovery completes. Acknowledged - Aggregator failover - we recommend running parallel Gateways if this is a concern. - Exposing blueprint / method details - this is currently handled better by the Engine State API. ## Underway URL: https://radix.wiki/developers/legacy-docs/updates/roadmap/gateway/gateway-underway Updated: 2026-02-18 Summary: This page is a high-level summary of what is currently "top-of-mind" in this area for the development team. Updates > Roadmap > Gateway > Underway — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/roadmap/gateway/gateway-underway.md) This page is a high-level summary of what is currently "top-of-mind" in this area for the development team. Many business-as-usual tasks, like testing, fixes, research & design, and small features/optimizations don't rise to the level of being included here. Items within sections are not ordered, and no timeline should be assumed for any particular item unless it has been explicitly communicated. - 1.11.0 - Improve Event Details - Create new detailed events, to aid creation of other aggregation tools. - Add resource_change filter to the transaction stream - A more useful filter than the manifest resources filter. 
- 1.11.0 - Further collection storage reworks - Ensure that aggregator overhead is constant. Working through the separate collections in the Gateway. - JSON receipt storage investigation - Further looking at how we can decrease the size of the ledger transactions table. ## Milestones URL: https://radix.wiki/developers/legacy-docs/updates/roadmap/gateway/gateway-milestones Updated: 2026-02-18 Summary: Milestones are big ticket items which require significant work and often have complex dependencies and/or impact upon other parts of the stack. Updates > Roadmap > Gateway > Milestones — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/roadmap/gateway/gateway-milestones.md) Milestones are big ticket items which require significant work and often have complex dependencies and/or impact upon other parts of the stack. Items on this page are ordered in expected delivery sequence. However, just because a milestone is lower on the list doesn't mean that work isn't progressing on it. Foundational work for future milestones in the form of research, design, prototyping, or dependency implementation proceeds in parallel with progress on the current milestone. Gateway Consolidation Our top focus at the moment is on improving the worst-case performance of the Gateway and Aggregator: - Improving how we store collections historically, and revising how these are returned from the API. - The new ordering will be “by first appearance of collection key on ledger ASC/DESC” Pending Transaction Rework Currently, the Gateway stores submitted (but not yet committed) transactions in its main database. It uses the stored information to provide status tracking for submitted transactions, and to enable the Gateway to resubmit the transaction on your behalf if it encounters difficulties or the network is temporarily too busy. Storing submitted transactions in its main database was a simple choice originally, but it puts constraints on how the service is deployed. 
The resubmission is also currently handled by the Data Aggregator, which should really be focused solely on ingesting ledger updates. Therefore our first milestone looks to address this, by moving pending transaction handling to a separate service and data store. This will result in better performance and improved resilience, and open up new possibilities for Gateway deployment. We expect this change to be "behind-the-scenes"; it won't result in substantial changes to the existing API surface. Future of Gateway Discovery The Gateway serves the needs of many different stakeholders. These create competing constraints on the design and architecture of the Gateway, to the extent where the current set-up doesn't serve any individual use case as well as it could. To address this, we believe it is time to look at how the Gateway and Radix stack could be evolved to better meet the current and future needs of these groups. We are referring to this process by the rather grand title of the "Gateway Discovery". This process will lead us to explore potential new designs and architectures, including: - Splitting the Gateway into more focused indexing services and APIs - Storing the data in services other than PostgreSQL - Reworking the abstractions and encapsulation of committed transactions, receipts and related information As this process progresses, we will be sharing updates with the community, and may consider surveying the community on key areas not covered by our previous surveys. Regardless of outcome, we will ensure any plans/directions are sign-posted well ahead of delivery, to allow integrators time to prepare and adapt if necessary. The existing Gateway will continue to be supported in a backwards-compatible manner throughout this process, at least until suitable replacements are available, and there has been ample time for integrators to adapt to those replacements.
We're really looking forward to the opportunities this presents for building better tools for integrators. Future milestones will depend on the outcome of the Discovery process. ## Gateway URL: https://radix.wiki/developers/legacy-docs/updates/roadmap/gateway/gateway Updated: 2026-02-18 Summary: Legacy documentation: Gateway Updates > Roadmap > Gateway — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/roadmap/gateway/README.md) ## Complete URL: https://radix.wiki/developers/legacy-docs/updates/roadmap/node-engine/node-engine-complete Updated: 2026-02-18 Summary: Designing and implementing new APIs for reading entity state, aligning with the "system" layer in the engine. Updates > Roadmap > Node/Engine > Complete — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/roadmap/node-engine/node-engine-complete.md) "Cuttlefish" coordinated protocol update - Support for subintents (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-transactions/pre-authorizations-and-subintents.md) - Additional native crypto utils for signature validation within Scrypto - Getters for Account balances - Throughput improvements "Bottlenose" coordinated protocol update - New AccountLocker native blueprint, as described here (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/locker.md) . - New API for reading component owner role from Scrypto - New substates that expose the current protocol-related parameters - Add recovery fee vault to AccessController, removing the need for third-party fee locking during the recovery process - Various improvements to Account and TransactionProcessor native blueprints Generic State APIs Designing and implementing new APIs for reading entity state, aligning with the "system" layer in the engine. API performance improvements Improving the DB locking used in the node with the APIs to improve throughput.
Tooling/testing refactor & optimizations (first round) Cleaning up and optimizing various aspects of the test suites, fuzzer, and benchmarking tools, and streamlining the continuous integration process for faster builds. Housekeeping work to speed up future development. "Anemone" Coordinated Protocol Update - Support for coordinated protocol updates with the Babylon engine - Correct the creation cost for validators to 100 USD - Proof GC - decreases storage usage for validators - Tweaked the Pool native blueprints to improve precision, and improved behavior with non-18 divisibility resources. - Add basic BLS support to Scrypto - Allow requesting TimePrecision::Second when requesting the current time in Scrypto Coordinated Protocol Updates Enables validators to signal readiness for a proposed protocol update, and prevents the network from enacting an update until a certain percentage of stake weight has signaled readiness. This allows for quicker uptake of updates without the risk of leaving too much of the validator set behind. Runtime Native Pre-compiles Certain work-intensive operations are impractical to execute in WASM due to excessive runtime costs. An expandable native VM supporting packages where important but "worky" operations can be implemented in-engine in Rust, but callable from Scrypto, enables speedy & low-cost execution of these operations. ## Backlog URL: https://radix.wiki/developers/legacy-docs/updates/roadmap/node-engine/node-engine-backlog Updated: 2026-02-18 Summary: Items within a particular category are not sorted by priority!
Updates > Roadmap > Node/Engine > Backlog — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/roadmap/node-engine/node-engine-backlog.md) - Higher Priority contains items which are most likely to be tackled next - General Items contains items which have been reviewed and are slated to be implemented - Acknowledged contains items which either have not been reviewed, or have a lower priority Items within a particular category are not sorted by priority! Higher Priority - Updating config for WASM validation so that it can accept Rust code compiled under nightly Rust 1.82+. - Performance improvements for Scrypto code instantiation General Items - Better error messages from transaction failures - Liveness Jailing - New sync options to allow faster syncing time - Support for tailored configuration of RocksDB  (possibly via a `.ini` file)  - Throughput improvements - e.g., Mempool Sync, Engine Optimizations - Complete design for rewrite of Java Node into Rust - Support for periodic/subscription payments in native Accounts Acknowledged ## Underway URL: https://radix.wiki/developers/legacy-docs/updates/roadmap/node-engine/node-engine-underway Updated: 2026-02-18 Summary: This page is a high-level summary of what is currently "top-of-mind" in this area for the development team. Updates > Roadmap > Node/Engine > Underway — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/roadmap/node-engine/node-engine-underway.md) This page is a high-level summary of what is currently "top-of-mind" in this area for the development team. Many business-as-usual tasks, like testing, fixes, research & design, and small features/optimizations don't rise to the level of being included here. Items within sections are not ordered, and no timeline should be assumed for any particular item unless it has been explicitly communicated. 
- Scoping for the "Dugong" (https://github.com/gguuttss/radix-docs/blob/master/updates/protocol-updates/dugong.md) coordinated protocol update ## Milestones URL: https://radix.wiki/developers/legacy-docs/updates/roadmap/node-engine/node-engine-milestones Updated: 2026-02-18 Summary: Milestones are big ticket items which require significant work and often have complex dependencies and/or impact upon other parts of the stack. Updates > Roadmap > Node/Engine > Milestones — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/roadmap/node-engine/node-engine-milestones.md) Milestones are big ticket items which require significant work and often have complex dependencies and/or impact upon other parts of the stack. Items on this page are ordered in expected delivery sequence. However, just because a milestone is lower on the list doesn't mean that work isn't progressing on it. Foundational work for future milestones in the form of research, design, prototyping, or dependency implementation proceeds in parallel with progress on the current milestone. Liveness Jailing Introduce automated mechanisms to temporarily remove validators from the validator set if they consistently fail to meet their network obligations. The system is designed such that well-run nodes which experience unusual problems can quickly get back to validating once their difficulties are resolved, but nodes which repeatedly fail to perform after re-entering the validator set will find themselves jailed for longer and longer periods. Node Rust Rewrite The Node code base is currently written in Java, while the Radix Engine is implemented in Rust. This milestone will unify both to Rust. This move will be accompanied by a host of other improvements and optimizations, to be detailed as work progresses. 
Large Throughput Optimizations There are several areas in different parts of the stack that have room for optimization, and can be individually addressed as appropriate at any point in time. This particular milestone is concerned with larger, structural optimizations that can make a significant impact on overall throughput, as network demand rises and greater throughput is needed.  This work is also a necessary step on the road to the Xi'an release. Xi'an Network Upgrade Xi'an is the next major upgrade to the Radix network.  Xi'an will implement the Cerberus consensus algorithm, enabling linear scalability without sacrificing atomic composability. This is an extraordinarily deep topic with a host of related content. You can get started by reading the original whitepaper (https://radixdlt.com/whitepapers/consensus) on Cerberus, or jump straight to the peer-reviewed whitepaper (http://radixdlt.com/whitepapers/peerreview) . ## Node/Engine URL: https://radix.wiki/developers/legacy-docs/updates/roadmap/node-engine/node-engine Updated: 2026-02-18 Summary: Legacy documentation: Node/Engine Updates > Roadmap > Node/Engine — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/roadmap/node-engine/README.md) ## Backlog URL: https://radix.wiki/developers/legacy-docs/updates/roadmap/scrypto/scrypto-backlog Updated: 2026-02-18 Summary: Items within a particular category are not sorted by priority! Updates > Roadmap > Scrypto > Backlog — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/roadmap/scrypto/scrypto-backlog.md) - Higher Priority contains items which are most likely to be tackled next - General Items contains items which have been reviewed and are slated to be implemented - Acknowledged contains items which either have not been reviewed, or have a lower priority Items within a particular category are not sorted by priority! 
Higher Priority - Auto-generation of manifest models for Scrypto components - Traits in Scrypto - Improved reporting of error location - Manifest support for dynamic resource and package addresses - SBOR performance improvements - Implementation of Allowances feature (https://www.radixdlt.com/blog/feedback-wanted-allowances-design) General Items - Add methods for checking Proofs directly against roles and rules - Reduce AuthZone overhead during execution - In-transaction transient state store for handling data which does not need to be persisted - Recognize and warn on dangling buckets at compile time - Support generics in SBOR schema Acknowledged ## Underway URL: https://radix.wiki/developers/legacy-docs/updates/roadmap/scrypto/scrypto-underway Updated: 2026-02-18 Summary: This page is a high-level summary of what is currently "top-of-mind" in this area for the development team. Updates > Roadmap > Scrypto > Underway — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/roadmap/scrypto/scrypto-underway.md) This page is a high-level summary of what is currently "top-of-mind" in this area for the development team. Many business-as-usual tasks, like testing, fixes, research & design, and small features/optimizations don't rise to the level of being included here. Items within sections are not ordered, and no timeline should be assumed for any particular item unless it has been explicitly communicated. - Scoping for 1.4.0 release along with Dugong (https://github.com/gguuttss/radix-docs/blob/master/updates/protocol-updates/dugong.md) ## Milestones URL: https://radix.wiki/developers/legacy-docs/updates/roadmap/scrypto/scrypto-milestones Updated: 2026-02-18 Summary: Milestones are big ticket items which require significant work and often have complex dependencies and/or impact upon other parts of the stack. 
Updates > Roadmap > Scrypto > Milestones — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/roadmap/scrypto/scrypto-milestones.md) Milestones are big ticket items which require significant work and often have complex dependencies and/or impact upon other parts of the stack. Items on this page are ordered in expected delivery sequence. However, just because a milestone is lower on the list doesn't mean that work isn't progressing on it. Foundational work for future milestones in the form of research, design, prototyping, or dependency implementation proceeds in parallel with progress on the current milestone. Allowances The primary authorization pattern (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/README.md) of Radix is based around the use of badges to demonstrate that an actor is permitted to perform a certain action. To avoid abuse and unintended privilege escalation, badges cannot be "re-used" by anything except what the actor is explicitly presenting them to. However, there are certain scenarios under which it makes sense for an actor to enable a particular action within a transaction, without needing to know who actually performs the action.  This will be supported by the creation of allowances, which are limited-use badges usable by anything within the transaction. For example, if a creator of an NFT wants to charge an amount every time someone wishes to transfer one of their tokens, they can set the withdrawal rules of their non-fungible resource to require a particular badge to be present in order to withdraw. They can then place the badge permitting the withdrawal into a component which charges 10 ZOMBO to produce an allowance which permits a single withdrawal of a single token. In order to transfer the NFT, the user's transaction must pass the component 10 ZOMBO in order to then be able to withdraw it and send it on to someone else. 
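The 10 ZOMBO example above can be sketched as a minimal Rust model. Note this is purely illustrative: the `Allowance` type, the `buy_withdrawal_allowance` fee component, and the `"withdraw_nft"` action name are all hypothetical, since real allowances will be a native engine/Scrypto feature rather than user-level code.

```rust
/// Illustrative model only: an allowance as a limited-use permission,
/// usable by anything within the same transaction.
pub struct Allowance {
    pub action: String,       // the single action this allowance permits
    pub uses_remaining: u32,  // limited-use: decremented on each consumption
}

impl Allowance {
    /// Consume one use; returns true if the action was authorized.
    pub fn consume(&mut self, action: &str) -> bool {
        if self.action == action && self.uses_remaining > 0 {
            self.uses_remaining -= 1;
            true
        } else {
            false
        }
    }
}

/// Hypothetical fee component: accept a 10 ZOMBO payment and mint a
/// single-use allowance permitting one NFT withdrawal.
pub fn buy_withdrawal_allowance(payment_zombo: u64) -> Option<Allowance> {
    const FEE: u64 = 10;
    if payment_zombo >= FEE {
        Some(Allowance {
            action: "withdraw_nft".to_string(),
            uses_remaining: 1,
        })
    } else {
        None
    }
}
```

The key property mirrored here is that the allowance, unlike a badge, is not bound to a specific presenter: anything in the transaction may consume it, but only the limited number of times the issuing component granted.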
Nested/Grouped NFTs Enable the ability for NFTs to contain other NFTs, or be linked such that they travel together. This enables an enormous host of interesting use cases, but requires careful design to avoid undesirable use patterns (for example, avoiding the ability to circumvent withdrawal rules by placing a restricted NFT into an unrestricted "container" NFT). Design thinking will be shared with the community for feedback as it develops. Upgradeable Blueprints Upgradability of smart contracts has been a thorny problem since the dawn of Ethereum, and this milestone is about enabling upgradability which allows creators to select what level of upgradability they desire, while informing potential consumers about what things the creator has the ability to change. To avoid any chance of unwelcome surprises for consumers, this work will not result in already-deployed blueprints and components becoming retroactively upgradeable. ## Scrypto URL: https://radix.wiki/developers/legacy-docs/updates/roadmap/scrypto/scrypto Updated: 2026-02-18 Summary: Legacy documentation: Scrypto Updates > Roadmap > Scrypto — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/roadmap/scrypto/README.md) ## Complete URL: https://radix.wiki/developers/legacy-docs/updates/roadmap/wallets/complete Updated: 2026-02-18 Summary: Legacy documentation: Complete Updates > Roadmap > Wallets > Complete — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/roadmap/wallets/complete.md) - Account deletion - Removes assets from account (if desired), sets account to disallow any further deposits, and permanently eliminates possibility of using the account - **Support for pre-authorization requests** - A new option for transaction construction with user signing and returning a portion of a complete transaction - Enables “intent”-based swaps required by institutional applications, as well as other use cases like more flexible delegated fee payment and more - 
Claim airdrops via trusted Lockers - Check for Lockers associated with dApps the user has logged into - Notify users when they have a deposit waiting in a trusted Locker, on the relevant account(s) - Let the user directly claim the deposit in the wallet UI - Option to disable claim notifications for specific dApps abusing the feature - Enable new ROLA proof options for dApps - One-time “re-login” style request to prove ownership of a specific Persona - One-time request to prove ownership of specific accounts - SVG image support on iOS (and as an addition to the metadata standard) - Option to require device auth (Biometrics/PIN) check upon wallet foregrounding - When transferring, let user know when recipient account's deposit rules will block deposit - Hide/show specific assets (perhaps "nuisance assets") - Show USD value of transaction fees on Transaction Review screen - Individually unhide Accounts/Personas - Info bubble pop-ups throughout the app - New address info screen - Tap on any address to bring up useful info panel - View the address in full, copy it, see a QR code, or verify it on Ledger device (where relevant) - New home screen onboarding "carousel" - Offer shortcuts for smoother onboard experience - Resource metadata improvements - Show full set of metadata key/value fields assigned to a resource - Show which resource metadata fields are locked - Radix Connect for Mobile - Enable use of dApp websites running on any mobile browser - Transparent for dApp developer - Radix dApp Toolkit automatically handles connecting to the Radix Wallet on either desktop or mobile - Lays the groundwork for support of native mobile app connections, and dApp-specific Connector-style "linking" - Completely rebuilt wallet backup system and UI for iOS and Android - Throw out OS-native “backup” systems, which are not reliable - Use of more direct iCloud and Google Drive data upload mechanisms that we can control, and have clear visibility of failures - Completely rebuilt UI for 
security setup and warning messages for better user understanding, including new Security Center that includes both security factors and backups in one place. Will further expand with MFA later. - Support linking multiple wallets with one Radix Wallet Connector browser extension - Updated Connector extension logic for automatic routing - Ability for Connector extension to display accounts list - Connector requests access to full account list from linked wallet - Shows full list in Connector extension window for easy reference and copy-pasting on desktop - In-wallet transaction history view - View transactions across time for a given account, shown by assets in/out of account and transaction type - Filter transactions shown by type, resource, and more - Tap a transaction to view more details about it on Dashboard - Display USD value of holdings - Backed by pricing service using Radix Charts for data - Additional transaction review summary types - Stake / request unstake / claim transaction types - General transaction display style design updates - Dedicated network staking asset tab - Separate LSU and claim NFT assets from pool units - Per-account summaries of current staked/unstaking/claimable XRD in the staking tab - Account hiding - Behaves like deletion but remains recoverable - deletion on-ledger is not possible - Optimize “chattiness” of wallets - Better caching and bundling of requests to avoid excess communication with the Gateway - Responsiveness improvements - Quicker load times for wallets with lots of accounts - Quicker load times for wallets with "resource heavy" accounts - Improve performance of fee calculation time - Load transaction review summaries immediately while waiting for fee payment information - Quicker load times for large collections of NFTs - Failure screen for failed transaction - Improved transaction error messages - Support for animated GIFs in resource images - Temporary transaction "History" handling - Link from Account view out to 
new Account transaction history page on Dashboard with basic balance information - Proper in-app History—including filtering and sorting and full transaction-type-based summary information—is still to come - No-backup-available recovery option - Recover from bare seed phrases or Ledger hardware wallets - Reconstruct derivation indices for each seed phrase or Ledger device - Supports both Babylon- and Olympia-created accounts - Also enables Olympia importing without usage of Olympia Desktop Wallet (for those who can accept not carrying over current account names or known list of account derivations) - Allow recovery from backup for users without Babylon seed phrase - Particularly for users who only use Ledger accounts, or haven't created any Babylon software accounts yet that they care about - Show full non-fungible data on NFT details view - Smarter transfer flow deposits (eliminate "extra" signatures when possible) - Check user-controlled deposit accounts for deposit rules, use `try_deposit` if the resource to be deposited would not actually be rejected - use `deposit` and surface "signature required" message when it would - Check non-user-controlled deposit accounts for deposit rules, surface warning when deposit rules will prevent the deposit - Improved verification that user has written down seed phrase ## Backlog URL: https://radix.wiki/developers/legacy-docs/updates/roadmap/wallets/wallets-backlog Updated: 2026-02-18 Summary: Items within a particular category are not sorted by priority! Updates > Roadmap > Wallets > Backlog — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/roadmap/wallets/wallets-backlog.md) - Higher Priority contains items which are most likely to be tackled next - General Items contains items which have been reviewed and are slated to be implemented - Acknowledged contains items which either have not been reviewed, or have a lower priority Items within a particular category are not sorted by priority! 
Higher Priority QOL Enhancements - Indication on dApp card (transaction review screen) if it is currently not on authorized dApps list - Copying and zooming of NFT images User Features - Redeem pool units directly from within wallet - Unstake directly from within wallet - Name service provider support - Add a name service provider in the wallet, and do lookups of names in place of account addresses - Dark mode - Persona Avatars - Both uploaded and NFT-based - “Share all” option for account sharing - Always shares the full list of all accounts (not just the current list of all accounts). Would actually be a specific profile entry. - Encrypted message support - Account details update - Viewing and selecting of relevant resource tags - View by type, dApp, tag Developer Features - Full Persona data implementation - Greatly expanded persona data fields (https://www.radixdlt.com/blog/feedback-wanted-radix-persona-data-types) General Items QOL Enhancements - Homescreen enhancements - User-specified re-ordering of accounts on homescreen - Security prompt to user on first address copy of an account: - "If you're about to send tokens to this account, you should consider writing down the seed phrase securing it. Do you want to do that now?" - Indication on account cards if an account is configured to not allow third party deposits - Transfer flow enhancements - When selecting resources, make resources tappable to bring up their details - as it does in the Account view (helps differentiate between tokens with the same name) - Allow selection of a full "collection" of NFTs to send - Allow amount of pool units to send to be picked by percentage (showing redeemable value) - Transaction Review enhancements - Indication on asset card (specifically in transaction review) if it is unknown to the wallet - Check if any withdrawals or badge proofs from user’s accounts will be invalid based on current state (ie. 
“you don’t actually have enough of X to do that”) - Should mostly only happen if a dApp is lazy, but avoids user spending fees for nothing - or having a transaction that is “rejected” but is still out there in the mempool and has to be explicitly canceled. - In fee customization, show the XRD balance of each account when picking fee payment account - Check for and warn about unusually high fees / royalties - Access to view non-resource parameters passed to components called in manifest - Transparency for users for dApps where assets aren't the primary function (data access for example) - Connector Extension / Radix Connect enhancements - User switching between signaling servers (similar to gateway server switching) - Expose similar option in CE - User-specified option for Radix Connect to never fall back to relay server - Support for Wallet push notifications and async communication - dApp usage enhancements - Have RDT send an explicit “cancel” request to the wallet - for a given request ID - if the user cancels that request on the dApp side - Just friendly to not have the user have to manually also cancel on the dApp side, even if no bad behavior can result from not doing this - General display enhancements - Comma/period use in numbers on iOS, from OS setting - Other QOL - Add link from account settings to Dashboard page for the account (once Dashboard can show more helpful transaction history) - Ledger device hiding/"deleting" - Have Connector extension linking QR code act as a deep link to go straight to linking flow in wallet - Initiate transfer from asset detail screen (within an account) - Wallet/extension version checking and update prompting - Offer option to set third-party deposit rule to “deny all” when an Account is hidden User Features - Enable applying guarantees of specific nonfungible IDs - Resource safety locking - prevent casual/accidental transfer of certain resources - Related: option (default on?) 
to warn severely against transfer of assets with “badge” tag - gives appropriate meaning to that tag, behavior can be part of metadata standard’s wallet handling to encourage use - Alias services support, for things like "domain" services that provide a name for an address - Address Book - User-added and managed aliases - Overlay with Name Service support - Handling of multiple in-flight transactions - Support cancelation for things stuck in mempool (when we have cancelation on ledger) - Time-delayed submission to preserve account ownership separation - Easily see recent successes/failures - “Check recovery” feature - Run the user through using their recovery/confirmation factors so they can be sure everything is good if something goes wrong. - Maybe even occasionally prompt to consider doing that check. - Support for nested/contained NFT display and manipulation - Requires Radix Engine support Developer Features - “No-modify” transaction manifest support - dApp must specify the signatures it needs directly, and the Transaction Review screen has special warnings, and does not modify the transaction at all (including fees). - Need to consider this in the context of MFA and external signer capability - New dApp request: ROLA proof of a specific resource/nonfungible - Useful when a frontend wants to gate access by badge ownership, much more direct than just requesting accounts and looking through them. Also useful in an “ongoing” sense, allowing the resource to even move around between accounts. - Wallet finds an account that contains it, provides ROLA proof of ownership of that account - Wallet-side storage of dApp data - Cookie style - Invoicing functionality - Requests using specific manifest templates partially specifying results, but also prompting wallet to fill in certain required information (eg. 
a source account, token quantity). Enables things like one-step payments without getting a list of accounts first from the ultimate signer - dApp-to-wallet (via Wallet API) - wallet-to-wallet or device-to-wallet (via QR) - Document signing/notarization - PDF/text legal agreements - Readable message payloads that grant off-ledger permissions to dApps Acknowledged - Android wallet on F-Droid or similar - Interesting, but needs consideration/review of non-technical factors - More factor types - Relevant after MFA implementation - Additional hardware devices? - Second phone, tablet, watch? - Third party custodial signer? - An asset I hold? (transferable?) - Use of existing credit card NFC? - More intelligent security configuration prompting - Relevant after MFA implementation - Prompt advanced security setups for high-value accounts, or in response to questions about user risk profile? - Real Account/Persona deletion - Set "tombstone" state on-ledger to no longer accept deposits/withdrawals, remove public keys, etc. - Fully remove from wallet (not just hide) - Language localization to top priority 1-3 non-English languages - Multi-device syncing - ie. single Profile synced across devices - “Plausible deniability” mode - Secretly put wallet in a mode where certain accounts are hidden from view and are inaccessible, but other (ostensibly low-value) accounts remain visible - Passport credential scanning via NFC - Storage as Persona data field that can be requested as proof of human: ePassport Basics from ICAO, summary blog ## Underway URL: https://radix.wiki/developers/legacy-docs/updates/roadmap/wallets/wallets-underway Updated: 2026-02-18 Summary: This page is a high-level summary of what is currently "top-of-mind" in this area for the development team. 
Updates > Roadmap > Wallets > Underway — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/roadmap/wallets/wallets-underway.md) This page is a high-level summary of what is currently "top-of-mind" in this area for the development team. Many business-as-usual tasks, like testing, fixes, research & design, and small features/optimizations don't rise to the level of being included here. Items within sections are not ordered, and no timeline should be assumed for any particular item unless it has been explicitly communicated. - First MFA implementation for release - “Security Shields” that can be applied to accounts and personas - Updated Security Center including Security Shield status, and updated factor addition/configuration - Specifying of “default” Security Shield, with ability to apply a specific Shield to single accounts/personas if desired - Additional factor options: Arculus cards, passwords, passphrases (BIP39 seed phrase you must type in for each signature) - Security Shield configuration wizard, ensuring minimum standards of security and recoverability and suggesting good Shield configurations from available factors while allowing user to fully customize factors used to sign, start recovery, and confirm recovery - Support for detecting a time-delayed recovery, highlighting it to users, allowing user to cancel or confirm (once delay is complete) - Automated signing flow for accounts controlled by arbitrary combinations of MFA factors to complete signing with minimum number of factors, but ensuring user can skip some factors and still complete signing with any valid combination - User-friendly migration flows to update Security Shields if a phone is lost or the user gets a new phone, including the ability for a new phone to request a signature from a previous device via QR code - Allow use of more factor types to create new accounts or personas in single-factor mode - Batched transaction submission with 
randomized delay, allowing many accounts/personas to be updated without easy on-ledger association of them belonging to the same human - New transaction queue from home screen, showing and letting users manage time-delayed transactions, transactions currently being processed by the network, pre-authorizations being processed by dApps, and any transactions that have failed - Detect “out of sync” condition when user’s desired Shield configuration does not match what is applied on-ledger, allow user to easily re-sync via Shield update transaction - Much under-the-hood work to ensure consistent MFA implementation across iOS/Android, including signing flow, and managing complex hierarchical deterministic derivations from different factors so different accounts/personas do not appear on-ledger to be using the same factors - Under-the-hood support for “personal security questions” factor type, to be added in later update. - Architecture work for future expansion: expanding Radix Connect Relay to enable future wallet-to-wallet signing flows, including “social recovery” use of a friend’s phone as a factor, use of an extra phone as a factor, and multi-party controlled accounts ## Milestones URL: https://radix.wiki/developers/legacy-docs/updates/roadmap/wallets/wallets-milestones Updated: 2026-02-18 Summary: Milestones are big ticket items which require significant work and often have complex dependencies and/or impact upon other parts of the stack. Updates > Roadmap > Wallets > Milestones — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/roadmap/wallets/wallets-milestones.md) Milestones are big ticket items which require significant work and often have complex dependencies and/or impact upon other parts of the stack. Items on this page are ordered in expected delivery sequence. However, just because a milestone is lower on the list doesn't mean that work isn't progressing on it. 
Foundational work for future milestones in the form of research, design, prototyping, or dependency implementation proceeds in parallel with progress on the current milestone. Multi-Factor Account/Persona Control and Recovery Frequently referred to simply as "MFA", this milestone will let you easily configure your accounts to use a variety of different combinations of signing factors in order to access your assets, and enables both personal and social account recovery, finally allowing you to do away with seed phrases (if you so choose). The Babylon release of the Radix Public Network already contains all the necessary capabilities; this milestone is the work necessary to support the various workflows within the Radix Wallets. You can read more about it here: https://www.radixdlt.com/blog/how-radix-multi-factor-smart-accounts-work-and-what-they-can-do (https://www.radixdlt.com/blog/how-radix-multi-factor-smart-accounts-work-and-what-they-can-do) Or here: https://learn.radixdlt.com/article/how-are-smart-accounts-created-controlled-and-recovered (https://learn.radixdlt.com/article/how-are-smart-accounts-created-controlled-and-recovered) Anticipated design includes: - Shared platform library for managing signing factors and workflows - Default MFA Setup w/ basic factor types - This device, legacy seed phrase, enter-to-sign seed phrase, Ledger, security questions, trusted contact - Account “recovery” flows - lost phone, migration to new phone, settings update - Account-specific MFA settings - Initiate recovery or lock for friend (from recovery badge) - First consumer hardware factor type - Arculus is current target - Automated batched transaction submission to preserve Account and Persona separation on ledger Push Notifications/Requests Currently, the user must manually open the Radix Wallet to receive requests (such as logins, transactions, and data requests) from dApps, and can receive them only while actively connected to a dApp.  
A push notification-based system can add the option to have one-tap opening of the wallet when making a request, and to enable specific dApps (and potentially contacts) to make requests to your wallet asynchronously. Anticipated design includes: - User’s linked Connector extension given permission to push typical dApp requests (transactions, etc.) - Per-dApp permission to do side-channel pushes of requests - Ideally peer-to-peer permissions for push Multi-Party Signing Radix transactions already support applying multiple signatures. This may be to authorize access to multiple accounts, or to provide multi-factor authorization to a given access controller as part of the multi-factor system. However, currently the Radix Wallet expects to provide all required signatures on any transactions proposed to it. A multi-party signing system would allow collaborative signing of transactions between multiple parties. Anticipated design includes: - Wallet-to-wallet transaction passing for peer transactions - RadFi NBA ticket purchase style - Use cases for off-ledger logic providing its own approval of certain transaction results - dApps able to provide their own signature on transactions without releasing the wallet’s control of final signature and submission. - Passing via webRTC/deeplink/QR/NFC? Session/Streaming Mode Typical one-by-one transaction review and approval may limit the user experience for some use cases. A session or "streaming" mode in the wallet would, with the user's permission, allow ongoing transaction approvals without one-by-one wallet interaction. Anticipated design includes: - Ability to give timed, dApp-specific permission for requests/transactions for “automatic” signatures against a given account, with constraints on acceptable transactions - Likely needs to be integrated with Access Controller design update to allow nomination and signature by a limited-time, revocable temporary key without need for individual biometrics, device access, etc. 
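The timed, constrained session-mode permission described above can be sketched as a simple gating check. This is a hypothetical Rust model of the idea only; the `SessionPermission` and `TxRequest` types, field names, and the XRD spend cap are all illustrative assumptions, not the wallet's actual design.

```rust
/// Illustrative model: a timed, dApp-specific grant allowing constrained
/// transactions to be approved without per-transaction wallet interaction.
pub struct SessionPermission {
    pub dapp_id: String,     // dApp the grant was issued to
    pub account: String,     // account the grant applies to
    pub expires_at: u64,     // expiry time (unix seconds) of the session
    pub max_xrd_per_tx: u64, // constraint: spend cap per transaction
}

/// A transaction request arriving from a dApp during the session.
pub struct TxRequest {
    pub dapp_id: String,
    pub account: String,
    pub xrd_out: u64, // XRD leaving the account in this transaction
    pub now: u64,     // current time (unix seconds)
}

impl SessionPermission {
    /// Auto-approve only if the request matches the granted dApp and
    /// account, arrives before expiry, and stays under the spend cap.
    /// Anything else would fall back to normal one-by-one review.
    pub fn auto_approves(&self, tx: &TxRequest) -> bool {
        tx.dapp_id == self.dapp_id
            && tx.account == self.account
            && tx.now < self.expires_at
            && tx.xrd_out <= self.max_xrd_per_tx
    }
}
```

In a real implementation the signing itself would be handled by something like the limited-time, revocable temporary key mentioned in the text, so that auto-approval does not require biometrics or device access for each transaction.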
Decentralized Identity / Verifiable Credentials As Web3 and DeFi mature, many use cases will begin to require verified information about individuals, their identity, their qualifications, and more – whether this is various types of "proof of human", AML/KYC information, or more day-to-day credentials. Using the Radix Wallet and on-ledger components and assets, we can create a system that provides composable on-ledger credentials, but without sacrificing the individual's privacy or their ability to be selective about what they share and with whom. This has many similarities to the well-considered W3C VC/DID model (https://www.w3.org/TR/vc-data-model-2.0/) , but using Radix features to create a functional implementation that is convenient, composable, and decentralized. Anticipated design includes: - Overall system design for credential issuers, verifiers, and dApps - Radix Wallet implementation of managing and selectively sharing VCs and proofs NFC Scanning Described during RadFi 2022 (https://radixdlt.com/radfi) , Near-Field Communication Scanning would give you the option for your wallet to receive requests via NFC, or to show ownership of an NFT, persona, or account to a tap-scanner, which the other system could then freely verify without requiring an on-ledger transaction. Anticipated design includes: - Asset/Persona/Account ROLA proofs via NFC tap - dApp requests or invoices and responses via NFC tap ## Wallets URL: https://radix.wiki/developers/legacy-docs/updates/roadmap/wallets/wallets Updated: 2026-02-18 Summary: Legacy documentation: Wallets Updates > Roadmap > Wallets — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/roadmap/wallets/README.md) ## Roadmap URL: https://radix.wiki/developers/legacy-docs/updates/roadmap/roadmap Updated: 2026-02-18 Summary: Curious about what's in the pipeline? You've come to the right place! 
Updates > Roadmap — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/roadmap/README.md) Curious about what's in the pipeline?  You've come to the right place! The Radix technical roadmap provides a high-level view of what's planned for the assorted parts of the network and surrounding products. You'll see that each area is broken down into sub-pages which cover different topics: - Milestones pages describe notable upcoming releases or big features, many of which require significant implementation effort and have complex dependencies and impact on other parts of the stack. Milestones are always ordered in predicted release order, with the top item expected soonest. - Underway pages give insight into what the associated product teams are working on right now. They're your best source to find out what's coming in the near future.  Items on these pages are not ordered; they show an unsorted list of things which are currently top-of-mind or in progress. - Backlog pages contain an assortment of ideas and features, categorized by how urgent they are currently viewed.  Items within each category are not sorted by priority. The roadmap is by no means an exhaustive list of every minor feature, nor does it call out day-to-day development activities.  Sections will be updated periodically as work items complete or when priorities shift. If you don't see a particular product feature appearing in any of its sub-pages, or you think a feature deserves higher priority than is currently shown, the absolute best way to argue your point is to provide your complete use case, either in Discord or by emailing hello@radixdlt.com (mailto:hello@radixdlt.com) with the details. 
This video explains the rationale behind the roadmap, and answers some common questions: https://www.youtube.com/embed/NcXHEbPaYXA?si=RVF_mleFuGsjp0gd (https://www.youtube.com/embed/NcXHEbPaYXA?si=RVF_mleFuGsjp0gd) ## Dugong (In Development) URL: https://radix.wiki/developers/legacy-docs/updates/protocol-updates/dugong Updated: 2026-02-18 Summary: Legacy documentation: Dugong (In Development) Updates > Protocol Updates > Dugong (In Development) — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/protocol-updates/dugong.md) Details - Display Name: Dugong - Logical Name: dugong - Mainnet Enactment: - Status: In Development - Epoch: TBC - Timestamp: TBC Main Changes To follow Release Notes To follow Libraries & SDKs To follow ## Cuttlefish (Live) URL: https://radix.wiki/developers/legacy-docs/updates/protocol-updates/cuttlefish Updated: 2026-02-18 Summary: Legacy documentation: Cuttlefish (Live) Updates > Protocol Updates > Cuttlefish (Live) — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/protocol-updates/cuttlefish.md) Details - Display Name: Cuttlefish - Logical Name: cuttlefish and cuttlefish-part2 (in two parts, back-to-back) - Mainnet Enactment: - Status: Enacted - Epoch: 105353 - Timestamp: 2024-12-18T10:48:58Z Main Changes - A new V2 transaction format supporting an intent structure (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/intent-structure.md) containing subintents (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/subintents.md) , and the associated pre-authorization flows (preauthorizations) to propose them in your dApp and have them reviewed and signed in the wallet.
- This is covered in depth in this Twitter space (starts at 4:30) (https://x.com/i/spaces/1vAxROdXqYVKl) - This powers Anthic Flash Liquidity (https://www.radixdlt.com/blog/technical-deep-dive-how-flash-liquidity-will-work) - Getters for balances on the native Account (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/account.md) blueprint. - CryptoUtils for blake256 hashing, Secp256k1 and Ed25519 validation. - Tweak to the consensus manager min rounds per epoch to make it much harder to miss 5 minute epochs. - Various improvements to the Radix Engine implementation. Release Notes - Scrypto v1.3.0 (https://github.com/gguuttss/radix-docs/blob/master/updates/release-notes/scrypto/scrypto-v1-3-0.md) Libraries & SDKs Rust - Engine Rust crates (https://github.com/gguuttss/radix-docs/blob/master/integrate/rust-libraries/rust-libraries-overview.md) (e.g. scrypto, scrypto-test, radix-transactions) - 1.3.0 ( crates.io (https://crates.io/crates/scrypto) ) TypeScript - @radixdlt/babylon-gateway-api-sdk - 1.9.2 ( npm (https://www.npmjs.com/package/@radixdlt/babylon-gateway-api-sdk/v/1.9.2) ) - dApp Toolkit - 2.2.0 ( npm (https://github.com/radixdlt/radix-dapp-toolkit/releases/tag/v2.2.0) ) - Radix Engine Toolkit - TypeScript support for Transaction V2 will not be available at Cuttlefish launch C# - Radix Engine Toolkit - v2.2.0 ( install guide (https://github.com/gguuttss/radix-docs/blob/master/integrate/radix-engine-toolkit/radix-engine-toolkit-installation.md) ) Kotlin - Radix Engine Toolkit - v2.2.0 ( install guide (https://github.com/gguuttss/radix-docs/blob/master/integrate/radix-engine-toolkit/radix-engine-toolkit-installation.md) ) Swift - Radix Engine Toolkit - v2.2.0 ( install guide (https://github.com/gguuttss/radix-docs/blob/master/integrate/radix-engine-toolkit/radix-engine-toolkit-installation.md) ) Python - Radix Engine Toolkit - v2.2.0 ( install guide
(https://github.com/gguuttss/radix-docs/blob/master/integrate/radix-engine-toolkit/radix-engine-toolkit-installation.md) ) Go - Radix Engine Toolkit - v2.2.0 ( install guide (https://github.com/gguuttss/radix-docs/blob/master/integrate/radix-engine-toolkit/radix-engine-toolkit-installation.md) ) ## Bottlenose URL: https://radix.wiki/developers/legacy-docs/updates/protocol-updates/bottlenose Updated: 2026-02-18 Summary: Legacy documentation: Bottlenose Updates > Protocol Updates > Bottlenose — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/protocol-updates/bottlenose.md) Details - Display Name: Bottlenose - Logical Name: bottlenose - Mainnet Enactment: - Status: Enacted - Epoch: 105353 - Timestamp: 2024-06-07T10:31:00Z Main Changes - New Account Locker native blueprint (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/locker.md) - New API for reading a component's owner role from Scrypto - New substates that expose the current protocol-related parameters - Add recovery fee vault to the Access Controller (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/access-controller.md) , removing the need for third-party fee locking during the recovery process - Various improvements to the Account (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/account.md) and Transaction Processor (transaction-processor) native blueprints - Various improvements to the Radix Engine implementation Release Notes - Scrypto v1.2.0 (https://github.com/gguuttss/radix-docs/blob/master/updates/release-notes/scrypto/scrypto-v1-2-0.md) Libraries & SDKs See later protocol updates (https://github.com/gguuttss/radix-docs/blob/master/updates/protocol-updates/README.md) .
## Anemone URL: https://radix.wiki/developers/legacy-docs/updates/protocol-updates/anemone Updated: 2026-02-18 Summary: Legacy documentation: Anemone Updates > Protocol Updates > Anemone — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/protocol-updates/anemone.md) Details - Display Name: Anemone - Logical Name: anemone - Mainnet Enactment: - Status: Enacted - Epoch: 70575 - Timestamp: 2024-02-07T16:20:57.229Z Main Changes - Correct the creation cost for validators to 100 USD - Bring basic BLS support to Scrypto - Allow requesting TimePrecision::Second when reading the current time in Scrypto - Tweak the pool blueprints to improve precision, and improve behaviour with non-18 divisibility resources. Release Notes - Scrypto v1.1.0 (https://github.com/gguuttss/radix-docs/blob/master/updates/release-notes/scrypto/scrypto-v1-1-0.md) Libraries & SDKs See later protocol updates (https://github.com/gguuttss/radix-docs/blob/master/updates/protocol-updates/README.md) . ## Babylon Genesis URL: https://radix.wiki/developers/legacy-docs/updates/protocol-updates/babylon-genesis Updated: 2026-02-18 Summary: This was a major update from Olympia, and brought the Babylon engine to Radix. Updates > Protocol Updates > Babylon Genesis — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/protocol-updates/babylon-genesis.md) Details - Display Name: Babylon - Logical Name: babylon-genesis in the node (babylon in the engine) - Mainnet Enactment: - Status: Enacted - Epoch: 32717 - Timestamp: 2023-09-28T07:05:16.176Z Main Changes This was a major update from Olympia, and brought the Babylon engine to Radix. Release Notes See later protocol updates (https://github.com/gguuttss/radix-docs/blob/master/updates/protocol-updates/README.md) . Libraries & SDKs See later protocol updates (https://github.com/gguuttss/radix-docs/blob/master/updates/protocol-updates/README.md) .
## Protocol Updates URL: https://radix.wiki/developers/legacy-docs/updates/protocol-updates/protocol-updates Updated: 2026-02-18 Summary: Protocol updates are named, major updates to the network, happening approximately two to three times a year. They are sometimes known as "hard forks" on other networks. Updates > Protocol Updates — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/protocol-updates/README.md) Protocol updates are named, major updates to the network, happening approximately two to three times a year. They are sometimes known as "hard forks" on other networks. Protocol updates require updates to the Node and Gateway for every node-runner of the network, and require the validator set to signal readiness (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-maintenance-and-administration/node-protocol-updates.md) . The first protocol update was known as Genesis, and subsequent protocol updates follow an alphabetical sea creature naming scheme. At the time of writing, the Babylon network has had the following updates: - Genesis (genesis) (babylon-genesis) - The migration from Olympia.
This brought the second generation Radix engine, with full smart contract support - Anemone (https://github.com/gguuttss/radix-docs/blob/master/updates/protocol-updates/anemone.md) (anemone) - Improved network time precision and various other small features/fixes - Bottlenose (https://github.com/gguuttss/radix-docs/blob/master/updates/protocol-updates/bottlenose.md) (bottlenose) - Account Locker blueprint and various other small features/fixes - Cuttlefish (https://github.com/gguuttss/radix-docs/blob/master/updates/protocol-updates/cuttlefish.md) (cuttlefish & cuttlefish-part2) - Subintents "pre-authorizations" and various other small features/fixes ## Updates URL: https://radix.wiki/developers/legacy-docs/updates/updates Updated: 2026-02-18 Summary: Legacy documentation: Updates Updates — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/updates/README.md) ## Scrypto SBOR Specs URL: https://radix.wiki/developers/legacy-docs/reference/sbor-serialization/scrypto-sbor/scrypto-sbor-specs Updated: 2026-02-18 Summary: This topic is still a work in progress Reference > SBOR Serialization > Scrypto SBOR > Scrypto SBOR Specs — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/sbor-serialization/scrypto-sbor/scrypto-sbor-specs.md) This topic is still a work in progress This area is either a placeholder for content which is yet to be written, or has sparse/early content which is still being worked on. Payload Specs Scrypto SBOR payload has a prefix of 0x5c and max depth of 64, per SBOR extension requirement. Custom Value Models Coming soon. Custom Type Models Coming soon. 
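As a minimal illustration of the payload spec above (a 0x5c prefix byte followed by the encoded root value), here is a sketch in plain Rust. The constant and helper names are illustrative stand-ins, not the real sbor crate API:

```rust
// Sketch only: frames an already-encoded root value as a Scrypto SBOR
// payload per the spec above (0x5c prefix byte, then the value bytes).
const SCRYPTO_SBOR_PAYLOAD_PREFIX: u8 = 0x5c;

fn frame_scrypto_sbor_payload(encoded_root_value: &[u8]) -> Vec<u8> {
    let mut payload = Vec::with_capacity(1 + encoded_root_value.len());
    payload.push(SCRYPTO_SBOR_PAYLOAD_PREFIX);
    payload.extend_from_slice(encoded_root_value);
    payload
}
```

So a payload carrying an encoded U8 value 42 (bytes 0x07, 0x2a per the value model's discriminators) would be the three bytes 0x5c 0x07 0x2a.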
## Scrypto SBOR URL: https://radix.wiki/developers/legacy-docs/reference/sbor-serialization/scrypto-sbor/scrypto-sbor Updated: 2026-02-18 Summary: Legacy documentation: Scrypto SBOR Reference > SBOR Serialization > Scrypto SBOR — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/sbor-serialization/scrypto-sbor/README.md) ## Manifest Value Syntax URL: https://radix.wiki/developers/legacy-docs/reference/sbor-serialization/manifest-sbor/manifest-value-syntax Updated: 2026-02-18 Summary: The manifest syntax for instruction arguments forms a representation of an encoded “Manifest SBOR” value. The Manifest SBOR value is converted by the transaction processor. Reference > SBOR Serialization > Manifest SBOR > Manifest Value Syntax — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/sbor-serialization/manifest-sbor/manifest-value-syntax.md) The manifest syntax for instruction arguments forms a representation of an encoded “Manifest SBOR” value. The Manifest SBOR value is converted by the transaction processor into “Scrypto SBOR” which is then used to make calls to other objects in the engine. A specific SBOR implementation is formed of a core of “basic” value kinds, and then implementation-specific additional value kinds. When the transaction processor processes the manifest, basic value kinds are mapped as-is, and Manifest value kinds are remapped to Scrypto value kinds. Basic Leaf Value Kinds Below are all the value kinds in the basic (common) SBOR model.

| Basic Value Kind | Manifest Syntax Example |
| --- | --- |
| Bool | true false |
| I8, I16, I32, I64, I128 | 5i8 -412i32 12345678i128 |
| U8, U16, U32, U64, U128 | 5u8 412u32 12345678u128 |
| String | "I like a \"quoted message\"!\n" |

Strings must be valid unicode. The string literal follows JSON escaping rules: - Double quotes must be escaped as \", and backslashes must be escaped as \\.
- You can optionally also use the following escapes: \/, \b, \f, \n, \r, \t and \uXXXX where XXXX is exactly four hex digits, i.e. a UTF-16 encoding. The resultant string must be valid unicode (so no unmatched surrogate pairs). If writing in JavaScript, JSON.stringify(value) can be used to encode a valid string literal. Basic Composite Value Kinds Below are all the composite value kinds in the basic (common) SBOR model. Basic Value Kind Manifest Syntax Example Tuple - Represents a product type, such as a Rust tuple or struct. - Contains 0 or more child values. Examples: - Tuple("Hello", 43u8) - Tuple() As an example, if you want to reference a structure or tuple from Scrypto, all three examples below would have their values written in the form Tuple(55u64, "David"): #[derive(ScryptoSbor)] pub struct Student { id: u64, name: String, } #[derive(ScryptoSbor)] pub struct StudentTwo(u64, String); (u64, String) Enum - Represents the value of a sum-type (combined with a product type), such as a Rust enum. - Technically this represents an Enum Variant value, but was shortened to Enum for brevity. - An Enum Variant consists of: - A u8 discriminator - this is typically the 0-index of the variant under the enum in Rust. - 0 or more child values, which form part of the enum variant. The canonical representation is: - Enum<1u8>() - Enum<24u8>("Hello", "world") Some engine-specific enums also have a nicer alias-syntax for their variants, e.g.: Enum( Enum( Enum( Enum( NonFungibleGlobalId("...") ) ) ) ) The full list of supported manifest enum variant names is given here (https://github.com/radixdlt/radixdlt-scrypto/blob/main/radix-transactions/src/manifest/manifest_enums.rs) . Option and Result also have Rust-like ident aliases for their variants, and so can be represented as Some("value") / None and Ok("Success") / Err("Failure!").
If you are representing enum variants of an enum defined in Scrypto, you can’t use the alias-syntax, and instead will have to rely on discriminant numbers. So you’ll need to work out the enum variant to use. E.g. the MouseMove variant below has index 1, and has two u32 fields, so you would write this in your transaction manifest as Enum<1u8>(320u32, 480u32). Similarly, the MousePress variant has index 3, and a single String field, so would be represented as e.g. Enum<3u8>("scroll-wheel"). #[derive(ScryptoSbor)] pub enum Event { PageLoad, MouseMove(u32, u32), KeyPress(String), MousePress { button_name: String } } Array - Represents a repeated value (such as an Array / Vec / Set / List, etc.) - All contained values must be of the same value kind. This value kind is specified in the angle brackets. - This also includes bytes - which are effectively an Array of U8, although their canonical representation is using the Bytes("") alias covered later. Array<String>("foo", "bar") Map - Represents a map (effectively an array of two-tuples). - All contained values must have the same value kind for their keys and the same value kind for their values. These value kinds are specified in the angle brackets. Map<U8, U16>(1u8 => 5u16, 2u8 => 7u16) You can also nest Map: Map<String, Tuple>( "Blueprint" => Tuple( Map<String, U32>( "method" => 1u32 ), 0u32 ) ); Manifest Value Kinds Manifest Value Kind Example Address Resolves to a reference to the given address - either fixed, or from a bound reference. If used as a NamedAddress: - The first instruction using this will bind the given name to the given created address. - In the compiled representation, names are removed and replaced with integer indexed references. - Fixed address: - Address("package_address") - Address("component_address") - Address("resource_address") - Bound named address (e.g. from AddressReservation): - NamedAddress("my_package") AddressReservation - Resolves to an owned AddressReservation.
- The first instruction using this will bind the given name to the given created/bound AddressReservation. - In the compiled representation, names are removed and replaced with integer indexed references. AddressReservation("address_reservation") Bucket - Resolves to an owned FungibleBucket or NonFungibleBucket. - The first instruction using this will bind the given name to the given created/bound bucket. - In the compiled representation, names are removed and replaced with integer indexed references. - Named bucket, Bucket("my_awesome_xrd_bucket") - Unnamed bucket, Bucket(5u32) Proof - Resolves to an owned FungibleProof or NonFungibleProof. - The first instruction using this will bind the given name to the given created/bound proof. - In the compiled representation, names are removed and replaced with integer indexed references. - Named proof, Proof("auth") - Unnamed proof, Proof(100u32) Expression Resolves to an array of owned Buckets or owned Proofs respectively. Only the following are supported: - Expression("ENTIRE_WORKTOP") - Expression("ENTIRE_AUTH_ZONE") Blob Resolves to a Vec<u8> of the content of the blob with the given hash in the transaction intent. Blob("") Decimal Resolves as a Decimal. Decimal("-123.456") PreciseDecimal Resolves as a PreciseDecimal. PreciseDecimal("1231232342342.123213123123") NonFungibleLocalId Resolves as a NonFungibleLocalId. - String: NonFungibleLocalId("<some_string>") - Integer: NonFungibleLocalId("#12123#") - Bytes: NonFungibleLocalId("[031b84c5567b126440995d3ed5aaba05]") - RUID: NonFungibleLocalId("{43968a7243968a72-5954595459545954-45da967845da9678-8659dd398659dd39}") Manifest aliases Manifest Alias Example Unit Resolves as a Tuple of length 0. () NonFungibleGlobalId Resolves as a Tuple of the resource address and the non-fungible local id.
Various examples: - String: NonFungibleGlobalId("${non_fungible_resource_address}:<some_string>") - Integer: NonFungibleGlobalId("${non_fungible_resource_address}:#123#") - Bytes: NonFungibleGlobalId("${non_fungible_resource_address}:[031b84c5567b12643213]") - RUID: NonFungibleGlobalId("${non_fungible_resource_address}:{..-..-..-..}") Which resolves to: - Tuple(Address("${non_fungible_resource_address}"), NonFungibleLocalId("...")) Bytes Resolves as an Array<U8> via hex decoding. This is the canonical representation of an Array<U8> in the manifest. Bytes("deadbeef") Various Enum ident aliases: - Some / None - Ok / Err These resolve to the relevant enum variant. Examples: Some("Hello World") None Ok("Success message.") Err("Failure message.") Expressions At the moment, two expressions are available. Expression("ENTIRE_WORKTOP") takes all resources present on the worktop and puts them in a vector of buckets that can then be used as an argument. Expression("ENTIRE_AUTH_ZONE") works similarly but with proofs present on the AuthZone. ## Manifest SBOR Specs URL: https://radix.wiki/developers/legacy-docs/reference/sbor-serialization/manifest-sbor/manifest-sbor-specs Updated: 2026-02-18 Summary: This topic is still a work in progress Reference > SBOR Serialization > Manifest SBOR > Manifest SBOR Specs — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/sbor-serialization/manifest-sbor/manifest-sbor-specs.md) This topic is still a work in progress This area is either a placeholder for content which is yet to be written, or has sparse/early content which is still being worked on. Payload Specs Manifest SBOR payload has a prefix of 0x4d and max depth of 24, per SBOR extension requirement. Custom Value Models Coming soon. Custom Type Models Coming soon.
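Tying the manifest value syntax together, here is a sketch of a single manifest instruction that deposits everything on the worktop into an account via the native Account blueprint's deposit_batch method; the address string is a placeholder in the same style as the examples above:

```
CALL_METHOD
    Address("account_address")
    "deposit_batch"
    Expression("ENTIRE_WORKTOP");
```

At the point the transaction processor runs this instruction, Expression("ENTIRE_WORKTOP") resolves to a vector of buckets containing all resources currently on the worktop, which is passed as the method argument.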
## Manifest SBOR URL: https://radix.wiki/developers/legacy-docs/reference/sbor-serialization/manifest-sbor/manifest-sbor Updated: 2026-02-18 Summary: Legacy documentation: Manifest SBOR Reference > SBOR Serialization > Manifest SBOR — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/sbor-serialization/manifest-sbor/README.md) ## Programmatic JSON URL: https://radix.wiki/developers/legacy-docs/reference/sbor-serialization/sbor-textual-representations/sbor-programmatic-json Updated: 2026-02-18 Summary: More explicit details coming soon! Reference > SBOR Serialization > Textual Representations > Programmatic JSON — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/sbor-serialization/sbor-textual-representations/sbor-programmatic-json.md) More explicit details coming soon! To summarise: - There is a core Programmatic JSON representation of the core value kinds. - Extensions define additional values in their value kind union. - For Scrypto SBOR, see the ProgrammaticScryptoSborValue type information in the  Gateway Swagger Documentation (https://mainnet.radixdlt.com/swagger/) . ## Textual Representations URL: https://radix.wiki/developers/legacy-docs/reference/sbor-serialization/sbor-textual-representations/sbor-textual-representations Updated: 2026-02-18 Summary: Legacy documentation: Textual Representations Reference > SBOR Serialization > Textual Representations — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/sbor-serialization/sbor-textual-representations/README.md) ## SBOR Type Model URL: https://radix.wiki/developers/legacy-docs/reference/sbor-serialization/sbor-type-model Updated: 2026-02-18 Summary: The SBOR type model layers on top of the value model, and adds extra context to a particular value. 
Reference > SBOR Serialization > SBOR Type Model — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/sbor-serialization/sbor-type-model.md) Overview The SBOR type model layers on top of the value model, and adds extra context to a particular value. A value by itself is compact and encodes just enough information to display the value in a rudimentary form - but to nicely display or validate a value, we need extra information. This information comes in the form of a schema which captures additional information about SBOR types, which can be associated with values in a particular payload. Type Identity Each type has an identity. The identity of a type in a given schema is encapsulated as a LocalTypeId and is either a: - WellKnownId - The type is "well-known" and is defined in code in the SBOR implementation, to save it being duplicated across lots of schemas - SchemaLocalIndex - The index of the type in this particular schema. Types are normalized into an array inside a schema to keep them compact and to enable recursive types. Sometimes you wish to refer to the identity of a type without having a particular schema in context. In this case there are two wider ids used in the Radix Engine, which can be used to locate the schema with which to resolve the given type: - The ScopedTypeId is (SchemaHash, LocalTypeId) - The FullyScopedTypeId is (NodeId, SchemaHash, LocalTypeId) and can be used to first locate the schema with the given hash from the schema partition under the given node; and then read the type with the given local type id. Type Data For each type, the information splits into three categories: - TypeKind - the kind of the type - There is a type kind for each value kind. - Composite types will include a link to their child type kinds.
- TypeMetadata - Associated naming for the given type kind - An optional name of the type - If it's a tuple, optionally names of every field - If it's an enum, names of each variant, and for each variant, optionally names of every field of that variant. - TypeValidation - Associated validation for the given type kind: - Numeric types can have an upper and lower bound. - Strings can have a bound on their length. - Collection types can have a bound on their entry count. - Custom types may have their own validations - for example, in Scrypto, it might require that the node passed in an Own is a specific blueprint, such as a FungibleBucket. Example Consider the following example in Rust.

#[derive(Sbor)] // Shorthand for Categorize, Encode, Decode and Describe
struct Tree {
    root: TreeNode,
    cached_len: TreeLength,
}

#[derive(Sbor)]
struct TreeNode {
    left: Option<Box<TreeNode>>,
    right: Option<Box<TreeNode>>,
}

#[derive(Sbor)]
#[sbor(transparent)]
struct TreeLength(usize); // A new-type for semantic purposes

I could take a particular Tree and encode it into an SBOR value. In textual manifest value syntax, this might look like:

Tuple(                           # Tree { .. }
    Tuple(                       # > root: TreeNode { .. }
        Enum<1u8>(               # >> left: Some(..)
            Tuple(               # >>> TreeNode { .. }
                Enum<0u8>(),     # >>>> left: None
                Enum<0u8>(),     # >>>> right: None
            ),
        ),
        Enum<0u8>(),             # >> right: None
    ),
    2usize,                      # > cached_len: TreeLength
)

The commented parts capture information which is included in the schema. The ability to create a schema from a type is captured by the Describe trait implementation, which is one of the traits included in the helper Sbor derive.
In this particular case, the single type schema created for Tree as a single root type would include four schema local types: - Tree (a tuple type-kind, with named fields) - TreeNode (a tuple type-kind, with named fields) - Option (an enum type-kind, with two variants, each with unnamed fields) - TreeLength (a usize type-kind) Typically you'll want Rust new types to be unique / named in the schema. But if you don't, you can set #[sbor(transparent, transparent_name)] to make them invisible to the type model. If setting transparent_name on TreeLength then there would only be three types in the schema. Uses of schemas By combining a schema with a value, you can enable: - Validation of a payload or sub-payload against the schema - this is used extensively in the Radix engine - Annotation of a valid payload when converted to a textual representation - this is used for annotated programmatic JSON, and certain annotated manifest formats. - The Radix Gateway API does this when outputting component state, key value store entries and transaction events. Schema deep-dive Type roots Schemas have an associated set of "type roots" which are used to generate a schema and define particular types of interest in a schema. Often there is only a single "type root" in the schema, in which case it is called a SingleTypeSchema. Given a Rust type, you can generate its single type schema. So colloquially this can be known as the type's schema. But there can also be multiple root types in a schema, for example in some blueprint schemas. Comparison and backwards compatibility Comparing schemas can be used to track structural changes of types and ensure types haven't changed in a way which isn't backwards compatible given the compatibility constraints on the type (e.g. the need for the type to read data from a database encoded with a previous structure).
This can be automated with the ScryptoSborAssertion (https://docs.rs/scrypto-test/latest/scrypto_test/prelude/derive.ScryptoSborAssertion.html) derive which can generate tests to ensure that types are only changed in ways which are suitably backwards compatible. ## SBOR Value Model URL: https://radix.wiki/developers/legacy-docs/reference/sbor-serialization/sbor-value-model Updated: 2026-02-18 Summary: You can think about the SBOR Value model as a more flexible, binary-friendly JSON model. Reference > SBOR Serialization > SBOR Value Model — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/sbor-serialization/sbor-value-model.md) Overview Mental model You can think about the SBOR Value model as a more flexible, binary-friendly JSON model (https://www.json.org/json-en.html) . A similar diagram covering the grammar of the binary encoding of an SBOR payload is in the Grammar section below. The SBOR value model is a discriminated union of values, discriminated by their value kind. In other words, each SBOR value has a specific value kind, and the data specific to that kind. An SBOR value can be thought of as a tree, with parent nodes being composite values such as Tuples/Arrays etc, and leaf nodes being basic data, or empty composite values. - A value kind which cannot contain children is known as a "leaf value kind" - A value kind which permits children is known as a "composite value kind" or in some cases "container". Each value kind has an associated discriminator byte used in the binary encoding to tell a decoder what the kind of a given value is. These can be seen in Rust here (https://github.com/radixdlt/radixdlt-scrypto/blob/6d35fe85de69d82b85700aaa9a68310a6163b72e/sbor/src/value.rs#L22-L78) . Core and Extension Value Kinds There are a set of value kinds common to all extensions; these are known as the core value kinds. These core value kinds take inspiration from the Rust data model.
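The discriminated-union mental model above can be sketched in plain Rust. This is an illustrative subset of value kinds only (the full set is in the linked Rust source), with each variant's discriminator byte taken from the core value kind tables:

```rust
// Sketch of the SBOR value model as a discriminated union (subset only).
// Leaf kinds carry data directly; composite kinds carry child values,
// forming the tree structure described above.
enum SborValue {
    Bool(bool),     // leaf
    U8(u8),         // leaf
    String(String), // leaf
    Array { element_kind: u8, elements: Vec<SborValue> }, // composite
    Tuple(Vec<SborValue>),                                // composite
}

// Discriminator bytes as given in the core value kind tables.
fn value_kind_byte(value: &SborValue) -> u8 {
    match value {
        SborValue::Bool(_) => 0x01,
        SborValue::U8(_) => 0x07,
        SborValue::String(_) => 0x0c,
        SborValue::Array { .. } => 0x20,
        SborValue::Tuple(_) => 0x21,
    }
}
```

Note how the Array variant carries a single element_kind for all of its children, reflecting the "lifted up" child value kind described in the composite encodings below.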
Each extension (such as Scrypto or Manifest SBOR) has its own set of custom value kinds, which can have a binary discriminator starting from 0x80. At present, all custom value kinds of supported extensions are leaf value kinds. In practice, custom value kinds could be composite, although some tooling currently assumes they are leaf only. Core Value Kinds Leaf Value Kinds

| Value Kind | Byte Discriminator | Byte Data |
| --- | --- | --- |
| Bool - A boolean value | 0x01 | 0 or 1 |
| I8 - A signed 8-bit integer | 0x02 | The value |
| I16 - A signed 16-bit integer | 0x03 | Big-endian |
| I32 - A signed 32-bit integer | 0x04 | Big-endian |
| I64 - A signed 64-bit integer | 0x05 | Big-endian |
| I128 - A signed 128-bit integer | 0x06 | Big-endian |
| U8 - An unsigned 8-bit integer | 0x07 | The value |
| U16 - An unsigned 16-bit integer | 0x08 | Big-endian |
| U32 - An unsigned 32-bit integer | 0x09 | Big-endian |
| U64 - An unsigned 64-bit integer | 0x0a | Big-endian |
| U128 - An unsigned 128-bit integer | 0x0b | Big-endian |
| String - A UTF-8 string | 0x0c | UTF-8 |

Floats are not supported in the core model Note that floats are not part of the core value model. This is because SBOR was designed for use with the Radix financial engine, which had no need for float data, for two reasons: - Execution has to be deterministic, and some floating point operations can have differing behaviour across different processors, even inside WASM. - Floating point numbers are rarely used in financial applications (instead, Scrypto uses a fixed-precision Decimal construct). Composite Value Kinds These value kinds allow construction of more complex types. The encodings of Arrays and Maps "lift up" the value kinds of their children, to avoid duplication and make certain operations more concise and performant (for example, enable a Rust Vec to be copied into an SBOR Array). Value Kind Byte Discriminator Byte Data Array - Any number of ordered elements of the same value kind.
In Rust, this can correspond to [T] or [T; N] or any iterable collection 0x20 The discriminator of the value kind, then the LEB128-encoded element count of the array, followed by each value without its value kind discriminator Tuple - A general product type: an ordered list of elements of possibly different value kinds. This corresponds to a Rust tuple or struct 0x21 The LEB128-encoded item count of the tuple, followed by each value with its value kind discriminator Enum - A general sum type / discriminated union with a tuple-like payload. This corresponds to a Rust enum. 0x22 A byte for the enum's discriminator, followed by its tuple data, i.e. the LEB128-encoded item count of its data tuple, followed by each value with its value kind discriminator Map - Any number of ordered key-value pairs of elements where all keys have the same value kind and all values have the same value kind. The value model allows duplicate keys to be present. In Rust, this can correspond to a Vec<(Key, Value)> or an IndexMap. 0x23 The discriminator of the key value kind, then the discriminator of the value value kind, then the LEB128-encoded entry count of the map, followed by each entry (i.e. key then value without their value kind discriminators) Value Model Caveats The following are some interesting caveats to understand when working with the value model, and understanding its invariants. Child types of arrays and maps must have a fixed value kind (i.e. must implement Categorize) In Rust, to have a Vec<X> or an IndexMap<X, Y> implement SBOR traits, it's required that X and Y have a fixed value kind. This is given by the Categorize trait. Types which don't have a fixed value kind (such as an arbitrary ScryptoValue) do not implement Categorize and can't be put into an array or map directly. If you wish to have an array item or map key/value be an arbitrary SBOR value, you will need a workaround. The following workarounds are possible: - Use a singleton tuple wrapper, e.g.
Vec<(ScryptoValue,)> - the Categorize constraint only applies to the tuple, and you can put anything you want inside the tuple. This works fine with the type system, but is slightly wasteful in the codec, and looks a little confusing.
- Use a more specific type, e.g. instead of the child being an arbitrary SBOR value, maybe it could be an arbitrary tuple. That is, instead of an IndexMap keyed by an arbitrary ScryptoValue (which is a compile error, because ScryptoValue doesn't implement Categorize), you could use an IndexMap keyed by an arbitrary tuple.
- Use a tuple instead of an array. This works well in the value model, but doesn't play well with the type model, which expects a tuple to have a fixed length.

Child type constraints are only one level deep

In the value model, only the value kinds of direct child types are constrained. That means things like the following are allowed:
- Arrays of tuples of distinct types
- Jagged arrays

The SBOR type model can be used to specify / add additional constraints on the data.

Arrays and Maps are inherently ordered

To ensure a value can be round-tripped uniquely, the value model only concerns itself with ordered values. If the round-trip property is important, ordered sets and maps such as IndexSet and IndexMap are recommended to maintain the ordering. When a non-ordered map is encoded, its entries are first ordered by key, which can slow things down for large maps.

Arrays and Maps permit duplicate items/keys

A set can be encoded as an SBOR array and a map can be encoded as an SBOR map - but the SBOR array and map model does not consider key identity or uniqueness. Instead, a particular codec for a Map or Set may choose to enforce this at encoding and/or decoding time.

Tuples and enums care about field ordering and not field names

If encoding a Rust struct or enum, note that there isn't any sense of "field name" in the value model for tuples or enums. The name of fields is instead captured in the schema at the type model layer.
If you wish to maintain a string key in the data, you could choose to use a Map instead.

Grammar

The grammar for the canonical binary encoding of SBOR is defined below. This is covered in Rust here (https://github.com/radixdlt/radixdlt-scrypto/tree/main/sbor/src).

## What is SBOR?
URL: https://radix.wiki/developers/legacy-docs/reference/sbor-serialization/what-is-sbor Updated: 2026-02-18 Summary: This topic is still a work in progress Reference > SBOR Serialization > What is SBOR? — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/sbor-serialization/what-is-sbor.md)

This topic is still a work in progress. This area is either a placeholder for content which is yet to be written, or has sparse/early content which is still being worked on.

SBOR is an open-source standard for modelling, interpretation and representation of data. The term SBOR was originally an acronym coined from “Scrypto Binary-Friendly Object Representation”. SBOR was designed to give a great developer experience for defining blueprint interfaces in Scrypto, but is now much more than a binary data format: it is used for transaction encoding, state storage and inter-frame communication across the Radix Engine. SBOR is a key technology in the Radix stack, and allows the Radix Engine and APIs to understand the meaning/intention of data. This enables top-tier experiences across the entire stack, for both dApp builders and end users: from interface validation and code-gen in Scrypto, through a natural JSON representation of dApp state in the Browse API, all the way to structured, intuitive representations of data in the wallets and dashboards.

What’s in the SBOR standard?

The SBOR standard encompasses a powerful logical model. The logical model includes a core set of value kinds and types, as well as in-built support for defining extensions.
The Radix Engine uses this extension paradigm to define bespoke variants of SBOR, tailored to specific needs in the Radix protocol. The SBOR standard also covers representations of the value model, including the canonical binary format which gives SBOR its name, as well as textual representations, and a choice of JSON formats.

The logical model has two main parts:
- A Value model (https://github.com/gguuttss/radix-docs/blob/master/reference/sbor-serialization/sbor-value-model.md), which defines how data can be represented in SBOR. There are 16 core value kinds, covering Strings, Integers, Enums, Arrays, Tuples and Maps. Extensions can define additional custom value kinds, specific to their use case.
- A Type model (https://github.com/gguuttss/radix-docs/blob/master/reference/sbor-serialization/sbor-type-model.md), which is used to represent concrete types used by the programmer. Types have a structure which defines how they can be represented in the value model and their relation to other types. Types additionally include names/metadata and validation instructions. The core and each extension define a set of “well-known” types, baked into the standard. User-defined types can be packaged into a schema.

The Radix Engine currently defines three SBOR variants:
- Basic SBOR is primarily used for testing of the SBOR standard. It uses an empty extension.
- Scrypto SBOR is used for communication between actors in the engine, and for storing state. The Scrypto SBOR extension adds new value kinds to communicate entity ownership and reference semantics. It also adds a custom schema, capturing various validations which the Radix Engine can perform.
- Manifest SBOR is used for the Radix transaction. The Manifest SBOR extension includes representations of placeholders in the manifest. The Manifest Extension doesn’t have its own type/schema model, and actually shares the Scrypto schema. This allows component calls in manifests to be interpreted and validated against scrypto schemas.
The SBOR standard also includes standardized representations of the value model. All variants share the same representation for core value kinds, but can have extension-specific representations of custom value kinds. The representations are:
- A unique serialization of the value model to/from bytes.
- Various string representations of the value model, some of which also make use of information in the Schema:
  - RustLike - Takes inspiration from Rust and other programming languages. Uses optional type context for a more compact format.
  - NestedString - Deprecated; intended to be like the manifest format.
  - Manifest Value Syntax (https://github.com/gguuttss/radix-docs/blob/master/reference/sbor-serialization/manifest-sbor/manifest-value-syntax.md) - Not a standardized representation, but used when creating Manifests.
- Various JSON representations, some of which also make use of information in the Schema:
  - Programmatic JSON (https://github.com/gguuttss/radix-docs/blob/master/reference/sbor-serialization/sbor-textual-representations/sbor-programmatic-json.md) is stable / standardized.
  - Natural JSON is intended as a JSON-native format, relying on type context for a more compact / intuitive format. It is partially implemented, in beta, and subject to change.
  - Display JSON is intended for use creating tree-based UIs from SBOR with type context.
  - Model JSON is deprecated, replaced by Programmatic JSON.
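The canonical byte serialization mentioned above can be illustrated with a toy encoder in plain Rust. This is a minimal sketch, not the real `sbor` crate API - the function names are invented here, and the layout (value kind discriminators, LEB128 counts, the "lifted" element kind for arrays, and big-endian integer bytes) follows the value-kind tables stated earlier in this export:

```rust
// Toy encoder sketching the canonical SBOR byte layout described above.
// Illustrative only - not the real `sbor` crate.

/// Unsigned LEB128, used by SBOR for element/item/entry counts.
fn uleb128(mut n: u64) -> Vec<u8> {
    let mut out = Vec::new();
    loop {
        let byte = (n & 0x7f) as u8;
        n >>= 7;
        if n == 0 {
            out.push(byte);
            return out;
        }
        out.push(byte | 0x80);
    }
}

/// A U32 leaf value: value kind discriminator 0x09, then the integer
/// bytes (big-endian, per the leaf value kind table).
fn encode_u32(v: u32) -> Vec<u8> {
    let mut out = vec![0x09];
    out.extend_from_slice(&v.to_be_bytes());
    out
}

/// An Array of U32s: 0x20, then the "lifted" element value kind (0x09),
/// then the LEB128 element count, then each element WITHOUT its own
/// value kind discriminator.
fn encode_u32_array(values: &[u32]) -> Vec<u8> {
    let mut out = vec![0x20, 0x09];
    out.extend(uleb128(values.len() as u64));
    for v in values {
        out.extend_from_slice(&v.to_be_bytes());
    }
    out
}
```

Note how the element value kind appears once in the array encoding, rather than once per element - this is the "lifting up" of child value kinds described in the Composite Value Kinds section.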
## SBOR Serialization URL: https://radix.wiki/developers/legacy-docs/reference/sbor-serialization/sbor-serialization Updated: 2026-02-18 Summary: Legacy documentation: SBOR Serialization Reference > SBOR Serialization — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/sbor-serialization/README.md) ## Subintents URL: https://radix.wiki/developers/legacy-docs/reference/transactions/subintents Updated: 2026-02-18 Summary: A subintent (also known as a pre-authorization in the dApp/wallet pre-authorization flow(preauthorizations)) can be thought of as being its own independent mini Reference > Transactions > Subintents — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/subintents.md) A subintent (also known as a pre-authorization in the dApp/wallet pre-authorization flow (preauthorizations) ) can be thought of as being its own independent mini transaction: - It has its own manifest, its own messages and its own intent header describing when it can be valid - It has its own id, the hash of the subintent, and is typically encountered bech32-encoded starting with subtxid_.... Subintents can only be committed as part of a transaction. Each subintent has a parent, and zero or more of its own subintent children, as part of a transaction's intent structure (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/intent-structure.md) . The intent structure article also explains how subintents can interact with their parents and children, through yielding and passing buckets. Execution A subintent starts execution by being yielded to from its parent for the first time. At the start of execution, and every time it YIELDS (intent-structure.md#yielding) to a parent or child, it may end up with buckets on its worktop from the other intent. In a valid transaction: - Every subintent must end with a YIELD_TO_PARENT. 
- Every YIELD_TO_PARENT instruction in a subintent must match with a corresponding YIELD_TO_CHILD in its parent (and vice versa).

These guarantee that in a successful execution, every subintent will be executed in its entirety.

Self-contained Subintents

Sometimes a subintent can be considered by a wallet to be equivalent to a transaction manifest, because it doesn't receive anything from its parent. Such a subintent is known as "self-contained", which allows the wallet to offer a better user experience using conforming manifest classification (conforming-transaction-manifest-types) and preview. This comes up in the "delegated fee payment" use case, where a dApp might pay to submit a transaction containing the user's subintent, without interacting with it.

To be specific, a subintent is "self-contained" if it:
- Starts with an ASSERT_WORKTOP_IS_EMPTY - This ensures the subintent receives no resources from its parent at the start of its execution
- Contains a body with no additional YIELD_TO_PARENT or YIELD_TO_CHILD instructions - These are technical limitations which make it possible to use preview on the contents
- Finishes with a YIELD_TO_PARENT instruction with no arguments - This ensures the subintent gives nothing to its parent

Structure

For full details, see the transaction structure (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/transaction-structure.md) article, but to summarize, a subintent contains an intent core, which includes:
- An intent header, capturing its validity constraints (e.g. min/max epoch and optionally a min/max proposer timestamp). The subintent can only be included in a transaction (and so can only be committed) during its validity window.
- An optional message
- Its transaction manifest (transaction-manifest), including its instructions; blobs; and zero or more child subintent hashes

Construction and Serialization

A subintent can be constructed with a Partial Transaction Builder.
This can construct a PartialTransaction - a sub-tree of the intent structure (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/intent-structure.md), with a subintent at the root. Typically layers are signed before being passed on to other layers, so the partial transaction builder typically outputs a SignedPartialTransaction, for use in the signed_child step of a parent transaction builder or partial transaction builder. When passing subintents to an aggregator, they are also typically passed around as a SignedPartialTransaction - or, more specifically, as the canonically encoded raw bytes of a SignedPartialTransaction. This is available with to_raw() in Rust, or to_payload_bytes() in the toolkit.

A partial transaction builder is provided:
- In the radix-transactions (https://github.com/gguuttss/radix-docs/blob/master/integrate/rust-libraries/rust-libraries-overview.md) crate, if building in Rust.
- In the radix-engine-toolkit (https://github.com/gguuttss/radix-docs/blob/master/integrate/radix-engine-toolkit/README.md), if building in other UniFFI-based stacks.

As of December 2024, it isn't exposed in the TypeScript builder.

Behaviour

Compared with transaction intents, a subintent behaves a little differently:
- A subintent cannot lock non-contingent fees (only contingent fees). This is the responsibility of the transaction intent.
- A subintent can be included in a transaction intent which is committed as a failure, and still be committed as a success in some other transaction intent after. This can be summarized as: "subintents are finalized only on success".

Aggregation & Matching

The network only has mempools for valid complete transactions, and does not concern itself with how subintents are combined together or "aggregated" into sensible complete transactions. Aggregation is out-of-band of the protocol, and instead handled by dApps.
We expect the wider ecosystem to launch aggregators and matchers built on top of the subintent primitive supported by the network.

## Transaction Notary
URL: https://radix.wiki/developers/legacy-docs/reference/transactions/transaction-notary Updated: 2026-02-18 Summary: The concept of notarization was introduced at Babylon as a means to support seamless multi-signature transactions. Reference > Transactions > Transaction Notary — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/transaction-notary.md)

The concept of notarization was introduced at Babylon as a means to support seamless multi-signature transactions.

How is it used?

In the transaction structure (transaction-structure):
- The transaction header defines a notary_public_key, and optionally whether the notary_is_signatory, which is an optimization so that the notary counts as a transaction intent signatory.
- This key must "notarize" the signed transaction intent to confirm the signatures are correct, by adding its own notary signature to the signed transaction intent. This creates the "notarized transaction" which can be submitted to the network as a user transaction.

Who is the notary?

Generally there will be one party who is orchestrating the transaction construction/signing/submission process. This party is a natural notary. For example:
- In many self-sign scenarios, the wallet itself can be the notary, and make use of the notary_is_signatory optimization.
- A custodian or signature-aggregator which orchestrates a threshold signature process could be a natural notary.
- In future wallet-to-wallet scenarios (e.g. recovery of an account using a contact as a recovery factor), one phone will act as a proposer/notary, and the contact will act as a simple transaction intent signer, and pass back an intent signature to be added to the transaction.
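The layering described above can be sketched with toy data structures. These are illustrative stand-ins, not types from the Radix codebase, and the keyed `std` hash below is a placeholder for a real cryptographic signature:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Toy model of the notarization layering. Real transactions use proper
// cryptographic hashes and signatures; DefaultHasher and the fake `sign`
// below are stand-ins purely to illustrate the structure.

#[derive(Hash)]
struct SignedTransactionIntent {
    transaction_intent_hash: u64, // stand-in for the real intent hash
    intent_signatures: Vec<u64>,  // stand-in for the intent signatures
}

struct NotarizedTransaction {
    signed_intent: SignedTransactionIntent,
    notary_signature: u64,
}

fn hash_of(value: &impl Hash) -> u64 {
    let mut h = DefaultHasher::new();
    value.hash(&mut h);
    h.finish()
}

/// Toy "signature": a keyed hash. A real notary signs with the private
/// key corresponding to the notary_public_key in the transaction header.
fn sign(key: u64, message_hash: u64) -> u64 {
    hash_of(&(key, message_hash))
}

/// The notary signs the hash of the signed transaction intent, producing
/// the notarized transaction that can be submitted to the network.
fn notarize(signed_intent: SignedTransactionIntent, notary_key: u64) -> NotarizedTransaction {
    let signed_intent_hash = hash_of(&signed_intent);
    let notary_signature = sign(notary_key, signed_intent_hash);
    NotarizedTransaction { signed_intent, notary_signature }
}
```

Because the notary signs over the whole signed intent (including the set of intent signatures), any later tampering with that signature set invalidates the notary signature - mirroring the tamper-prevention property described in the Motivation section.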
Motivation for the notary concept

Compared to many other networks, the Babylon engine:
- Doesn't have a clear "signer" or "fee payer" on a transaction.
- Is designed so that the presence/absence of signatures can cause the transaction to fail (via the auth model (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/README.md) and implicit signature requirements (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/advanced-accessrules.md)), but does not affect the execution of the transaction.
  - In particular, you can't read the signatories inside the transaction.
  - This has benefits in terms of readability: the intent itself captures the content of the transaction.
  - It makes preview more powerful and accurate: we can support flags to disable authorization without affecting control flow.
- In certain threshold signature scenarios with complicated authorization rules, there may be multiple combinations of signers which could correctly allow the transaction to be successful.

We wanted our transaction design to:
- Support sending concurrent transactions without worrying about interaction between these transactions where it wasn't required (i.e. to avoid the "account nonce" feature)
- Prevent tampering with the signatures which could cause a transaction payload to fail unfairly and waste the fee

We also had other requirements for the transaction design (e.g. to avoid adding permanent bloat to the ledger state) - these further aspects of the design are captured in the Transaction Tracker (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/transaction-tracker.md) reference.

The notary concept solves these problems:
- The notary signature prevents tampering with the intent signatures
- All standard signers agree that the notary key is given limited power over the transaction to control the signing and submission process

What can the notary do?
The notary has the power/responsibility to:
- Select a correct set of signatures to include in the transaction.
  - If the notary were malicious or incompetent, they could choose incorrect signatories which could cause the transaction to fail, and waste the fee, or they could choose too many signatories which would waste fees in transaction validation costs.
- Finalize the transaction for submission... or decide not to.
- Attempt to invalidate the transaction if it is stuck in the mempool post-submission and hasn't been committed yet.
  - This is not yet supported as of Cuttlefish, but the Transaction Tracker was designed with this in mind.
  - For now, we advise transaction submitters to set a short expiry window with an epoch-based (cheap) or timestamp-based (slightly more expensive) expiry.

What can't the notary do?

Due to the design of the Radix Engine, the presence/absence of the signatures shouldn't have any direct bearing on the result of a successful transaction, only whether it succeeds or fails. A notary cannot:
- Replay the transaction intent: A notary is theoretically able to make multiple submissions of the same intent. But these will all have the same transaction id (i.e. transaction intent hash), and the engine will guarantee only one of these will commit.
- Affect the result of a successful transaction: A notary can add the wrong signatures, or add cost to the transaction by adding more signatures, but apart from that, cannot cause the transaction to succeed with different behaviour. This is because the Radix Engine ensures that the exact identity of the signatures is not readable in the transaction (outside of an authorization assertion which would cause the transaction to reject/fail).

## Transaction Intents
URL: https://radix.wiki/developers/legacy-docs/reference/transactions/transaction-intents Updated: 2026-02-18 Summary: Each transaction has a single transaction intent.
Its transaction id is the hash of the transaction intent, and is typically encountered bech32-encoded starting Reference > Transactions > Transaction Intents — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/transaction-intents.md)

Each transaction has a single transaction intent. Its transaction id is the hash of the transaction intent, and is typically encountered bech32-encoded starting with txid_....

Structure

For full details, see the transaction structure (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/transaction-structure.md) article, but to summarize, a transaction intent contains:
- A transaction header, defining the notary and tip.
- Its intent core, including:
  - An intent header, capturing when it's valid
  - An optional message
  - Its transaction manifest (transaction-manifest)
- Zero or more subintents (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/subintents.md)

Behaviour

Compared with subintents, the transaction intent is special:
- Only the transaction intent is allowed to lock a non-contingent fee, and only the transaction intent defines the tip.
- If a transaction execution fails after a fee is locked, then the transaction intent is recorded as a committed failure, and can never be committed again. Subintents, by contrast, can be committed as part of failed transactions zero or more times and still later be committed successfully.
- If a notary is marked as a signatory, the notary only counts as a signatory of the transaction intent.

## Transaction Structure
URL: https://radix.wiki/developers/legacy-docs/reference/transactions/transaction-structure Updated: 2026-02-18 Summary: This article covers the flattened structure of the transaction model and serialized transactions.
Reference > Transactions > Transaction Structure — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/transaction-structure.md)

This article covers the flattened structure of the transaction model and serialized transactions. When building or executing a transaction, it is better to think in terms of its tree-based Intent Structure. We advise you to read about the intent structure (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/intent-structure.md) before reading this article.

V1 User Transaction Structure

It is much the same as the V2 structure, except it has no support for subintents. In particular:
- V1 transactions have a single intent. This was split into the transaction intent and intent core concepts in V2.
- V1 transactions have a single header. This was split into the single Transaction Header and per-intent Intent Header in V2.

V2 User Transaction Structure

In the persisted structure, the subintents are flattened into an array. During validation, they are converted into a tree structure. When building transactions, you will build them up in this tree-based intent structure (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/intent-structure.md). Transaction hashes discussed below are built up in merklized layers, such that a transaction hash can be used to prove details about the content of a transaction without providing the full transaction.

Notarized Transaction

The notarized transaction is the fully signed transaction, and is represented by its NotarizedTransactionHash (notarizedtransaction_...). It contains:
- The signed transaction intent
- The notary's signature of the signed intent hash. Note that the notary may or may not count as a signatory of the transaction, depending on the notary_is_signatory flag in the transaction header.

Signed Transaction Intent

The signed transaction intent is represented by its SignedTransactionIntentHash (signedintent_...).
It contains:
- The transaction intent
- The signatures of the root transaction intent (of its transaction intent hash)
- A flattened array of signatures for each included subintent (of its subintent hash)

Transaction Intent

The transaction intent is represented by its TransactionIntentHash, shown to users as the transaction id (txid_...). The Radix Engine guarantees each transaction intent can be committed at most once (as either a success or failure). It contains:
- The transaction header, capturing:
  - The notary_public_key which must notarize the transaction
  - A flag notary_is_signatory which determines if the notary counts as a signatory of the root transaction intent core
  - The tip_basis_points
- The root transaction intent core
- A flattened array of subintents, each containing just its intent core.

Subintent

Each subintent is represented by its SubintentHash (subtxid_...), sometimes shown to users as its preauthorization id. The Radix Engine guarantees each subintent can be successfully committed at most once. It can be committed in a failing transaction zero or more times before that, but is unable to pay fees. Each subintent only contains its intent core.

Intent Core

An intent core is common to both the transaction intent and each subintent. A transaction can therefore contain one or more intent cores. Each intent core contains:
- An intent header, capturing:
  - The network_id which must match the network
  - The min_epoch_inclusive and max_epoch_exclusive in which the intent can be committed
  - An optional min_proposer_timestamp_inclusive and optional max_proposer_timestamp_exclusive between which the intent can be committed
  - An intent_discriminator, typically set to a random number - which allows generating a new transaction id for what could otherwise be the same intent.
- An optional message (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/concepts-transactions.md) - The intent's manifest (transaction-manifest) - One or more blobs (uninterpreted byte payloads which can be passed to smart contracts in the manifest) - An array of the subintent hashes of each child - A list of instructions (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/manifest/manifest-instructions.md) , which can be represented in a human readable form V2 Partial Transaction Structure When constructing transactions containing more than one intent, it's typical to build them up in subtrees, according to their intent structure (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/intent-structure.md) . In particular, there are two additional useful models: Signed Partial Transaction A signed partial transaction is an equivalent of a signed transaction intent, but with a subintent root. It contains: - The root partial transaction - The signatures of the root subintent (of its subintent hash) - A flattened array of signatures for each included non-root subintent (of its subintent hash) Partial Transaction A partial transaction is an equivalent of a transaction intent, but with a subintent root. It contains: - The root subintent - A flattened array of non-root subintents ## Intent Structure URL: https://radix.wiki/developers/legacy-docs/reference/transactions/intent-structure Updated: 2026-02-18 Summary: This article explains the tree-based structure of intents in a transaction. If you want to learn about the persisted structure of a transaction, read about the Reference > Transactions > Intent Structure — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/intent-structure.md) This article explains the tree-based structure of intents in a transaction. 
If you want to learn about the persisted structure of a transaction, read about the Transaction Structure (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/transaction-structure.md) after reading this article.

Intent Tree

In any transaction, there is a single transaction intent (transaction-intent) and zero or more subintents (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/subintents.md). Together, these are called the intents of a transaction. In a valid transaction, these intents form a tree, with the transaction intent at the root, and subintents below, in layers. Each intent has zero or more child subintents, and each subintent has a single unique parent.

Interaction between intents

Each intent declares its children as pseudo-instructions at the start of its manifest: USE_CHILD NamedIntent("my_child") Intent("subtxid_sim1lh5la66jj3dwl69z2cjjf0hphaj90yl5l5xnd7s8mxx273tkhw2qer299e") ;

There are currently two ways intents can interact:

Yielding

Intents can yield control to their direct child or parent. This passes control to that intent, and can also be used to pass buckets. These buckets end up on the worktop of the other intent, before its execution is resumed. The specific manifest instructions (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/manifest/manifest-instructions.md) are:
- Intents can use YIELD_TO_CHILD NamedIntent("xxx") ; to yield to their direct child.
- Subintents can use YIELD_TO_PARENT ; to yield to their direct parent.

For a transaction to be valid:
- Every YIELD_TO_PARENT instruction in a subintent must match with a corresponding YIELD_TO_CHILD in its parent (and vice versa).
- Every subintent must end with a YIELD_TO_PARENT.

This ensures that a successful transaction must have executed the content of every subintent in its entirety.

Verifying the direct parent intent

Sometimes, when constructing a subintent, you only want it to be used by a particular counterparty.
For example, it could be used by dApps to give users or regulated integrators guarantees about who will consume the subintent. This is where the "verify parent" check comes in. Subintents can use the VERIFY_PARENT ; manifest instruction (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/manifest/manifest-instructions.md) to assert against the auth zone (authzone) of the parent intent's processor, which can see:
- Signatures of the parent intent (using a signature requirement (../../build/scrypto-1/auth/advanced-accessrules.md#signature-requirements))
- Proofs (e.g. of badges) created during execution which are currently on the parent's auth zone

Limits

As of Cuttlefish, the following limits apply:
- A transaction can only have an intent depth of 4 (a transaction intent root, and three additional levels of subintents)
- A transaction can have a maximum of 32 total subintents, and 64 signatures across these subintents
- Each manifest can have a maximum of 1000 instructions

Constructing transactions with multiple intents

To be able to construct transactions with multiple intents, it's easiest to build up subtrees of the transaction. These are represented by the following models (see also Transaction Structure (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/transaction-structure.md)):
- A PartialTransaction - a subtree of a transaction, with a subintent root. It is a subintent-analogue of a TransactionIntent.
- A SignedPartialTransaction - a PartialTransaction with signatures for each subintent in the transaction. It is a subintent-analogue of a SignedTransactionIntent.

These can be constructed with a Partial Transaction Builder, available in the Rust code as PartialTransactionV2Builder or in the UniFFI toolkits.
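The validity rules and limits above can be sketched as a checker over a simplified intent tree. The `Intent` struct, the stringly-typed instructions, and the index-based child naming are illustrative assumptions, not the real radix-transactions types:

```rust
// Illustrative checker for the intent-tree rules described above.
// `Intent` and the string instructions are toy stand-ins.

struct Intent {
    instructions: Vec<String>,
    children: Vec<Intent>,
}

fn count_instruction(intent: &Intent, instruction: &str) -> usize {
    intent.instructions.iter().filter(|i| i.as_str() == instruction).count()
}

fn count_subintents(intent: &Intent) -> usize {
    intent.children.iter().map(|c| 1 + count_subintents(c)).sum()
}

/// Recursively check: maximum intent depth of 4; every subintent ends
/// with YIELD_TO_PARENT; and each child's YIELD_TO_PARENT count matches
/// the parent's YIELD_TO_CHILD count for that child (children are named
/// by index here, instead of NamedIntent labels).
fn check(intent: &Intent, is_root: bool, depth: usize) -> bool {
    if depth > 4 {
        return false;
    }
    if !is_root && intent.instructions.last().map(String::as_str) != Some("YIELD_TO_PARENT") {
        return false;
    }
    intent.children.iter().enumerate().all(|(i, child)| {
        count_instruction(intent, &format!("YIELD_TO_CHILD {}", i))
            == count_instruction(child, "YIELD_TO_PARENT")
            && check(child, false, depth + 1)
    })
}

/// A transaction is valid (under this sketch) if the tree rules hold
/// and there are at most 32 subintents in total.
fn is_valid(root: &Intent) -> bool {
    count_subintents(root) <= 32 && check(root, true, 1)
}
```

Note the depth count starts at 1 for the transaction intent root, so the `depth > 4` check allows the root plus three additional levels of subintents, matching the Cuttlefish limit above.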
## Conforming Manifest Types URL: https://radix.wiki/developers/legacy-docs/reference/transactions/manifest/conforming-manifest-types Updated: 2026-02-18 Summary: Transaction manifests make it possible for users to know the things that are important to them when signing. However, raw manifests are too complex for the aver Reference > Transactions > Manifest > Conforming Manifest Types — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/manifest/conforming-manifest-types.md) Introduction Transaction manifests make it possible for users to know the things that are important to them when signing. However, raw manifests are too complex for the average user to directly read and so the Radix Wallet, Radix Dashboard, and other client software show users a summarized transaction review UI for things they are about to sign. Because manifests enable a virtually unlimited range of complex actions using potentially multiple Scrypto components, summarization may seem like a near-impossible task. Fortunately, the manifest’s security model means we can get all the information we need for the user-relevant parts of the transaction’s results by parsing only the manifest itself – not what happens within the Scrypto logic of component calls. This is possible because, in the Radix transaction manifest model, movements of assets between the user’s account and other components happen in the transaction manifest itself. Scrypto components can just be treated as black boxes that accept resources and data as inputs, and return resources and data as outputs. We don’t really have to care about those inputs and outputs, because ultimately what matters to the user are the deposits, withdrawals, and other interactions with their accounts – not the details in between. 
This means we can summarize the vast majority of transactions in normal use cases with a set of simple “transaction type” patterns of transaction manifest construction that focus on the things that matter to users. Each of those transaction types can have their own focused UI presentation in the wallet. This document describes the list of transaction type patterns that the Radix Wallet plans to support and how they are identified. Developers can use this information to guide how they build transaction manifest stubs to send to user wallets to provide a smooth user experience. Wallet Handling of Manifests The Radix Wallet will handle proposed transaction manifest stubs from dApps differently depending on its contents. Below are the different ways in which manifests may be handled. Conforming and Non-Conforming Manifests The general principle of transaction types is this: If a given transaction is described by any of the transaction types described below, it is considered “conforming”. Otherwise, it is considered “non-conforming”. - Conforming manifests, matching at least one of the transaction type patterns, can be easily summarized and presented in user-friendly UI by the Radix Wallet. - Non-conforming manifests leave the wallet no choice but to fall back to an “advanced mode” that essentially displays the raw transaction manifest that is to be submitted to the network. In advanced mode, the user still has the power to examine the transaction in advanced view in full detail, but the wallet is unable to summarize it elegantly like it can for conforming transactions. Developers are thus encouraged to avoid the pitfalls that would cause their transaction manifests to be non-conforming, and ensure their users have a good user experience when interacting with their dApps using the Radix Wallet. Conforming manifests are not, however, intended to constrain transaction manifest construction to the point of limiting utility. 
Every reasonable use case we can think of (including complex multi-component composed transactions) can be served well by a conforming manifest, if the developer heeds some simple guidelines. In fact, the vast majority of normal DeFi transactions are handled well by the “general transaction” type (see below) – most of the other types are intended to cover more unusual and specific things a user may need to do only occasionally, or would typically do only using the Radix Wallet itself. More conforming transaction types may be added in the future, expanding the range of what the Radix Wallet can present nicely to the user without falling back to a more raw presentation. Wallet-unacceptable Manifests The Radix Wallet expects to make its own additions to the manifest on the user’s behalf before signing and submitting it. These additions allow the user to assert their own control over their accounts, assets, and expected results of transactions. This means that there are some reserved manifest instructions that may cause a transaction to be considered unacceptable by the wallet and be rejected. Parsing of transaction manifests, to see what transaction types they might match, occurs before the wallet makes its own additions. To be more accurate, parsing is done on a preview receipt of a transaction manifest stub submitted by a dApp to the wallet.
The workflow is as follows: - dApp sends a request to the wallet with a TX manifest stub - Wallet performs a preview of this TX manifest stub, receiving a preview receipt - Wallet analyzes this preview receipt (for a number of things, including transaction type) - Wallet makes its own additions to the TX manifest stub, creating the final TX manifest - TX manifest is signed, notarized, and submitted to the network The additions the wallet will make include calling one of the user’s accounts to lock any required transaction fee and calling any of the user’s access controllers required in order to be authorized to make account calls that are included in the transaction. dApps do not need to insert these instructions into the manifest stubs they create. Indeed, attempting to do so will generally be a reason for the wallet to reject them out of hand. In fact, many other method calls to accounts and access controllers are also grounds for rejection, if the transaction doesn’t conform to a transaction type that allows the wallet to summarize it clearly for the user. The following manifest instructions will cause a manifest stub to be rejected by the wallet if it is non-conforming: - Account lock fee method calls - Account or Identity securify method calls - Account or Identity sets or locks of the owner_keys metadata field - Access Controller method calls (all) Conforming Manifest Types Below is the current list of conforming transaction manifest types. These transaction manifest types are presented in reverse priority order. This is important because a transaction might match more than one type (particularly with the “general transaction” type which provides a flexible fallback for many types). This means that the Radix Wallet should prefer the supported transaction type that appears lowest in the list. Types lower down the list are “more specific” – meaning their summaries will be more informative than those of the more general types further up.
For each type, we list the commands that are allowed in the manifest, assuming that anything not explicitly listed falls outside that type (other than the common manifest instructions listed above). 0) General Transaction (final fallback if no other types are matched) The first type is the most general, providing a good summary for many kinds of transactions if nothing more specific is matched below. The general transaction is intended to match just about any sort of typical arbitrary DeFi interaction, and summarize it in a way that tells the user what matters to them. The majority of transaction manifests built by dApps will be of the general transaction type. A general transaction may include any of these instructions in the transaction manifest stub: - Any number of withdrawals from account components - Any number of calls to non-account components - Any number of deposits to account components (either of defined quantity, or undefined) - Any number of proofs produced from account components If the transaction is made of these elements, it can be succinctly and consistently summarized in a concise review screen. Such a view encapsulates everything that matters to the user in such a transaction: They know what they lose, what they gain, what they’re interacting with, and what proofs of their own assets were required to accomplish it. Deposits that are of undefined quantity can be estimated (via transaction preview) and assert statements added to guardrail them. The transaction might also include movements of assets between non-account components, but these details are safely ignored in the summary view because only the user’s own assets and accounts ultimately matter to them when deciding whether or not to sign. Even though this transaction type is very flexible, there are still many manifest instructions that will cause it to fall out of the general transaction category.
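To make the general-transaction shape concrete, the sketch below shows a manifest stub matching the pattern: a withdrawal, a call to a non-account component, and a deposit back to the account. The account and component addresses and the "swap" method name are placeholders invented for illustration; the XRD resource address is the simulator address used elsewhere in this reference.

```
# Withdraw 10 XRD from the user's account (placeholder account address)
CALL_METHOD
    Address("account_sim1_placeholder")
    "withdraw"
    Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
    Decimal("10")
;

# Take the exact amount into a named bucket, for clear static guarantees
TAKE_FROM_WORKTOP
    Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
    Decimal("10")
    Bucket("payment")
;

# Call a hypothetical dApp component, passing the bucket as an argument
CALL_METHOD
    Address("component_sim1_placeholder")
    "swap"
    Bucket("payment")
;

# Deposit whatever the component returned (an undefined-quantity deposit)
CALL_METHOD
    Address("account_sim1_placeholder")
    "deposit_batch"
    Expression("ENTIRE_WORKTOP")
;
```

A stub like this matches the general transaction type: the wallet can present the withdrawal and the (estimated) deposit, while treating the component call itself as a black box.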
In some cases, a more specific transaction type (as listed below) might be required for certain instructions to be allowed in a conforming manifest. Things that would fall outside the general transaction category include: - Updates to metadata, access rules, or royalties - Stake, unstake, or claim calls to validators - Package deployments Allowed instructions in General manifests In addition to the transaction manifest instructions listed above, a variety of utility manifest instructions are allowed for the general transaction type, to make it as flexible as possible for normal dApp interactions. These do not affect user-relevant results and so they can be safely included without changing the nature of the normal user summary of a general transaction. The following instructions are allowed: - All resource assertions (e.g. ASSERT_WORKTOP_CONTAINS) - All bucket instructions except burning (e.g. TAKE_FROM_WORKTOP) - All proof instructions (e.g. PUSH_TO_AUTH_ZONE) - All function calls - Most main-module method calls to accounts: withdraws, non-refund deposits, creating proofs and locking fees. The refund methods are disallowed because their deposit behaviour isn’t easy to present to the user. - Public main-module method calls to validators: stake, unstake and claim_xrd (although if you’re just doing this action, try to conform to the specific type for slightly better display in the wallet) - Standard main-module method calls to pools: contribute and redeem (although if you’re just doing this action, try to conform to the specific type for slightly better display in the wallet) - All main-module method calls to scrypto components - All main-module method calls to account lockers - Allocating new addresses (ALLOCATE_GLOBAL_ADDRESS) - Burning resources (BURN_RESOURCE) Disallowed instructions in General manifests The following instructions are disallowed: - Calling methods on: - Packages (e.g. 
claiming package royalties) - Pools - restricted_deposit and restricted_withdraw as these are owner methods. - Validators - non-public methods - Access Controllers - instead, see the dedicated conforming manifest types below. This will change soon, although calls to the Access Controller are still rejected instructions and can only be added by the wallet. - Resources (e.g. mints) - Calling non-main-module methods (CALL_ROYALTY_METHOD, CALL_METADATA_METHOD, CALL_ROLE_ASSIGNMENT_METHOD, CALL_DIRECT_VAULT_METHOD and their associated aliases (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/manifest/manifest-instructions.md) ) - Interacting with other intents (YIELD_TO_PARENT, YIELD_TO_CHILD and VERIFY_PARENT) 0.1) General Subintent For pre-authorization (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-transactions/pre-authorizations-and-subintents.md) summaries to be shown in conforming form in the wallet, their manifest has to have a “General Subintent” type. A manifest is classified as a General Subintent (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/subintents.md) if: - It has at least one YIELD_TO_PARENT instruction. - Each instruction is either a supported instruction in the General category, or YIELD_TO_PARENT or VERIFY_PARENT. Note that a General Subintent is currently not allowed to have children itself. Optimizing pre-authorization display in the wallet The pre-authorization display focuses on statically guaranteed resource movements. To optimize display of your manifest stub in the wallet, you should try to ensure that the static guarantees are very clear, by using certain manifest instructions (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/manifest/manifest-instructions.md) to constrain what is possible. 
- If your manifest is self-contained, and doesn’t interact with its parent, then the subintent is transaction-like and the wallet can give its best display using transaction rules and preview. To achieve this, start with ASSERT_WORKTOP_IS_EMPTY, end with an empty YIELD_TO_PARENT and contain no other subintent-only instructions. - Otherwise, you will need to add assertion instructions to statically assert/constrain the resource deposits, to give your users more certainty. These assertions are also crucial to ensure that your users’ expectations are met when the subintent is included in a transaction, and the user gets what they want. Bear in mind that: - At the start of the subintent, you may receive arbitrary resources from the parent intent - At every dApp call and every YIELD, you may receive arbitrary resources from the parent intent If you’re not careful, users may see warnings in the wallet, because they have approved a possible deposit of unspecified resources. There are lots of patterns that can be used to constrain the account (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/account.md) deposits. The following are some examples of possible structures for this: - If you know an exact set of resources that will be deposited, then: - If you expect exact amounts, then create buckets with exact amounts with the TAKE_FROM_WORKTOP command - If you have a minimum bound on amounts, use TAKE_ALL_FROM_WORKTOP for a resource, and either: - Use ASSERT_WORKTOP_CONTAINS to put a minimum bound on the worktop amount before the bucket is created - Use ASSERT_BUCKET_CONTENTS to put a minimum bound on the bucket once created (or more complicated bounds). - And then deposit exact buckets with deposit [account] Bucket(...) or deposit_batch [account] Array(...)
- Otherwise, if you have some flexibility in what gets deposited, and you have to use deposit_batch [account] Expression("ENTIRE_WORKTOP") then you will need to constrain things further. You have a couple of options. Either: - Use ASSERT_WORKTOP_RESOURCES_ONLY to constrain the resources to an allow list and attach bounds to them, then deposit Expression("ENTIRE_WORKTOP") - Use ASSERT_WORKTOP_RESOURCES_INCLUDE to attach bounds to certain resources, but allow other resources to be present, and then deposit Expression("ENTIRE_WORKTOP") 1) Transfer This type is when assets are being transferred from a user’s account directly to one or more other accounts, without any other component calls. Commonly this transaction type would be generated internally by the wallet’s own transfer feature, although it certainly could be generated by an external dApp. Allowed transaction manifest instructions: - Any number of withdrawals from a single account component (typically one owned by the user) - Any number of deposits to any number of account components, all of defined quantity That is all - meaning, among other things, that no non-account component calls are allowed. With this type, the wallet is able to show a summary in an understandable style in terms of a “sender” and the assets to be received by one or more recipients. 1a) Simple Transfer This is a subset of the Transfer type, intended specifically to allow very simple (but very common) 1-to-1 transfers of a single type of asset to be understood by Ledger hardware wallet devices. The treatment in the Radix Wallet is identical to the more general Transfer, but it has a more limited set of allowed transaction manifest commands for Ledger devices: - A single withdrawal from an account component (typically one owned by the user) - either a withdrawal of a fungible resource or some number of non-fungibles of the same resource.
- A single create bucket by amount (whether fungible or non-fungible) - A single deposit of that bucket to an account component Again, no non-account component calls. In this case, the Ledger device can show a summary with a simple To/From/Resource+Amount format. More complicated transfers will be treated by Ledger devices in the same way as other general transactions. 2) Contribute to Pool Allowed transaction manifest instructions: - Bucket creations - Account methods to lock fee, non-refund deposit methods, and the "withdraw" method - The "contribute" method on pools - Resource assertions 3) Redeem from Pool Allowed transaction manifest instructions: - Bucket creations - Account methods to lock fee, non-refund deposit methods, and the "withdraw" method - The "redeem" method on pools - Resource assertions 4) Stake to Validator This type is when a user is directly staking XRD to one or more validator components, receiving liquid stake unit tokens as a result. Allowed transaction manifest instructions: - An ordered set of calls that withdraw a quantity of XRD from a user account, pass the full quantity of those XRD to a validator component’s stake method, and deposit the resulting liquid stake unit tokens back to that same account. - One or more sets of the above, to support multiple stakes. With this type, the wallet is able to show a summary of a set of stakes in a “staking X from account A, to validator B” style. 5) Unstake from Validator This type is when a user is requesting an unstake of XRD from one or more validator components, receiving claim NFT(s) as a result. Allowed transaction manifest instructions: - An ordered set of calls that withdraw a quantity of LSU tokens from a user account, pass the full quantity of LSUs to a validator component’s unstake method, and deposit the resulting claim NFT back to that same account. - One or more sets of the above, to support multiple unstakes.
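As an illustrative sketch of the unstake pattern just described (all addresses here are invented placeholders; the LSU resource is the liquid stake unit token issued by the validator being unstaked from):

```
# Withdraw the LSUs to be unstaked (placeholder account and LSU addresses)
CALL_METHOD
    Address("account_sim1_placeholder")
    "withdraw"
    Address("resource_sim1_lsu_placeholder")
    Decimal("100")
;
TAKE_ALL_FROM_WORKTOP
    Address("resource_sim1_lsu_placeholder")
    Bucket("stake_units")
;

# Pass the full quantity of LSUs to the validator's unstake method
CALL_METHOD
    Address("validator_sim1_placeholder")
    "unstake"
    Bucket("stake_units")
;

# Deposit the resulting claim NFT back to the same account
CALL_METHOD
    Address("account_sim1_placeholder")
    "deposit_batch"
    Expression("ENTIRE_WORKTOP")
;
```

The stake and claim types follow the same withdraw → validator call → deposit shape, with the validator’s stake or claim method in place of unstake.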
With this type, the wallet is able to show a summary of a set of unstakes in a “requesting unstake of X from validator A, to account B” style. 6) Claim Stake from Validator This type is when a user is requesting a claim of unstaked XRD from one or more validator components. Allowed transaction manifest instructions: - An ordered set of calls that withdraw a single claim NFT from a user account, pass it to a validator component’s claim method, and deposit the resulting XRD back to that same account. - One or more sets of the above, to support multiple claims. With this type, the wallet is able to show a summary of a set of claims in a “claiming X from validator A, to account B” style. 7) Update Account Deposit Settings This type is when a user is updating the configuration settings of one or more of their own accounts (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/account.md) that control if/how third parties are able to deposit assets to them. Allowed transaction manifest instructions: - One or more calls to set_default_deposit_rule (choosing to set Accept, Reject, or AllowExisting) - One or more calls to set_resource_preference (choosing to set Allowed or Disallowed) or remove_resource_preference - One or more calls to add_authorized_depositor or remove_authorized_depositor With this type, the wallet is able to show a description of settings to be changed, on a per-account basis. Possible future conforming types - Deploy package - Update metadata - Update validator settings - Claim component or package royalties - Burn a held asset - Coming with MFA support in the wallet, Access Controller actions, such as updating the account’s security configuration, recovery, locking and unlocking. These will only be usable by the wallet, and cannot be created by dApps.
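To illustrate the Update Account Deposit Settings pattern above, a manifest stub of this type might be sketched as follows. The account and resource addresses are placeholders, and the enum discriminants shown are illustrative only – consult the account blueprint reference for the exact variant values.

```
# Set the account's default deposit rule (e.g. to Reject third-party deposits)
CALL_METHOD
    Address("account_sim1_placeholder")
    "set_default_deposit_rule"
    Enum<1u8>()   # illustrative discriminant for Reject
;

# Always allow deposits of one specific resource, regardless of the default rule
CALL_METHOD
    Address("account_sim1_placeholder")
    "set_resource_preference"
    Address("resource_sim1_placeholder")
    Enum<0u8>()   # illustrative discriminant for Allowed
;

# Authorize holders of a particular badge resource as depositors
CALL_METHOD
    Address("account_sim1_placeholder")
    "add_authorized_depositor"
    Enum<1u8>(Address("resource_sim1_badge_placeholder"))   # illustrative Resource variant
;
```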
## Manifest Instructions URL: https://radix.wiki/developers/legacy-docs/reference/transactions/manifest/manifest-instructions Updated: 2026-02-18 Notes - Addresses on this page are based on the simulator network definition. - All examples should be compilable, using the rtmc CLI. If you spot a broken example, please reach out to the support team on Discord. Grammar The Radix transaction manifest adopts a bash-like grammar. Each manifest consists of a sequence of instructions, each of which contains: - A command for the type of operation - Zero or more arguments, in manifest value syntax (https://github.com/gguuttss/radix-docs/blob/master/reference/sbor-serialization/manifest-sbor/manifest-value-syntax.md) . - A semicolon Instruction List The table below shows all the instructions that are currently supported by the Radix Engine. Many instructions, such as CALL_METHOD and CALL_FUNCTION, take arbitrary values – these values can be expressed in manifest value syntax (https://github.com/gguuttss/radix-docs/blob/master/reference/sbor-serialization/manifest-sbor/manifest-value-syntax.md) . Some instructions are marked as Added in V2. These instructions are only available when building manifests for a TransactionManifestV2 or SubintentManifestV2, introduced in Cuttlefish. At Cuttlefish launch, the wallet will only support TransactionManifestV1. It will, however, support SubintentManifestV2 in pre-authorization requests. Bucket Lifecycle TAKE_FROM_WORKTOP Creates a named bucket with the specified amount of resource on the worktop. It errors if there is insufficient resource available.
If the exact amount of resource is known, this command should be preferred to TAKE_ALL_FROM_WORKTOP or Expression("ENTIRE_WORKTOP") because it gives clearer static guarantees to the user. If the exact non-fungible ids are known, TAKE_NON_FUNGIBLES_FROM_WORKTOP should be used instead, as it is even more specific. Example TAKE_FROM_WORKTOP Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3") Decimal("1.0") Bucket("xrd_bucket") ; TAKE_NON_FUNGIBLES_FROM_WORKTOP Creates a named bucket with the specified non-fungibles on the worktop. It errors if these are not present on the worktop. If the exact non-fungibles are known, this command should be preferred to TAKE_FROM_WORKTOP, TAKE_ALL_FROM_WORKTOP or Expression("ENTIRE_WORKTOP") because it gives clearer static guarantees to the user. Example TAKE_NON_FUNGIBLES_FROM_WORKTOP Address("resource_sim1n2q4le7dpzucmpnksxj5ku28r3t776pgk879cahgm76c2kfpz48fpj") Array(NonFungibleLocalId("#1#"), NonFungibleLocalId("#2#")) Bucket("nfts") ; TAKE_ALL_FROM_WORKTOP Creates a named bucket from all of the given resource currently on the worktop. If the exact non-fungibles are known, prefer TAKE_NON_FUNGIBLES_FROM_WORKTOP and if the exact balance is known, prefer TAKE_FROM_WORKTOP as these give clearer static guarantees to the user. Example TAKE_ALL_FROM_WORKTOP Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3") Bucket("xrd_bucket") ; RETURN_TO_WORKTOP Consumes a named bucket, returning all of its contents to the worktop. Example TAKE_ALL_FROM_WORKTOP Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3") Bucket("xrd_bucket") ; RETURN_TO_WORKTOP Bucket("xrd_bucket") ; BURN_RESOURCE Consumes a named bucket, burning all of its contents. This errors if the auth zone does not provide authorization for the burn action.
Example TAKE_ALL_FROM_WORKTOP Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3") Bucket("xrd_bucket") ; BURN_RESOURCE Bucket("xrd_bucket") ; Resource Assertions ASSERT_WORKTOP_CONTAINS_ANY Verifies that the worktop contains any non-zero amount of the given fungible or non-fungible resource. Example ASSERT_WORKTOP_CONTAINS_ANY Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3") ; ASSERT_WORKTOP_CONTAINS Verifies that the worktop contains at least the given (non-zero) amount of the fungible or non-fungible resource, else aborts the transaction. Example ASSERT_WORKTOP_CONTAINS Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3") Decimal("1.0") ; ASSERT_WORKTOP_CONTAINS_NON_FUNGIBLES Verifies that the worktop contains the specified non-fungibles, else aborts the transaction. Example ASSERT_WORKTOP_CONTAINS_NON_FUNGIBLES Address("resource_sim1n2q4le7dpzucmpnksxj5ku28r3t776pgk879cahgm76c2kfpz48fpj") Array(NonFungibleLocalId("#1#"), NonFungibleLocalId("#2#")) ; ASSERT_WORKTOP_RESOURCES_ONLY (Added in V2) Asserts that the worktop’s balance of the specified resources matches the given constraints, and also ensures that all unspecified resources have zero balance. Use ASSERT_WORKTOP_RESOURCES_INCLUDE instead if you want to allow balances of other unspecified resources. To construct constraints, you may wish to see the definition of ManifestResourceConstraint and GeneralResourceConstraint defined here (https://github.com/radixdlt/radixdlt-scrypto/blob/develop/radix-common/src/data/manifest/model/manifest_resource_assertion.rs) . Example ASSERT_WORKTOP_RESOURCES_ONLY Map( Address("${resource_address}") => Enum(), ) ; ASSERT_WORKTOP_IS_EMPTY (Added in V2 | alias) Asserts that the worktop contains no balance of any resource. This is an alias for ASSERT_WORKTOP_RESOURCES_ONLY with an empty map. 
Example ASSERT_WORKTOP_IS_EMPTY; ASSERT_WORKTOP_RESOURCES_INCLUDE (Added in V2) Asserts that the worktop’s balance of the specified resources matches the given constraints, but permits the worktop to contain non-zero balances of other unspecified resources. Use ASSERT_WORKTOP_RESOURCES_ONLY instead if you want to disallow other unspecified resource balances. To construct constraints, you may wish to see the definition of ManifestResourceConstraint and GeneralResourceConstraint defined here (https://github.com/radixdlt/radixdlt-scrypto/blob/develop/radix-common/src/data/manifest/model/manifest_resource_assertion.rs) . Example ASSERT_WORKTOP_RESOURCES_INCLUDE Map( Address("${fungible_resource_address}") => Enum( Decimal("1") ), Address("${non_fungible_resource_address}") => Enum( Decimal("2") ), ) ; ASSERT_NEXT_CALL_RETURNS_ONLY (Added in V2) Asserts that the following invocation instruction must return buckets whose balance of the specified resources matches the given constraints, and also ensures that all unspecified resources have zero balance. An invocation instruction is one of the instructions starting YIELD or CALL or one of the CALL aliases. The invocation must immediately follow this instruction. Use ASSERT_NEXT_CALL_RETURNS_INCLUDE instead if you want to allow other unspecified resource balances. To construct constraints, you may wish to see the definition of ManifestResourceConstraint and GeneralResourceConstraint defined here (https://github.com/radixdlt/radixdlt-scrypto/blob/develop/radix-common/src/data/manifest/model/manifest_resource_assertion.rs) . Example ASSERT_NEXT_CALL_RETURNS_ONLY Map( Address("${non_fungible_resource_address}") => Enum( Array( NonFungibleLocalId("#234#") ) ), ) ; ASSERT_NEXT_CALL_RETURNS_INCLUDE (Added in V2) Asserts that the following invocation instruction must return buckets whose balance of the specified resources matches the given constraints, but permits non-zero balances of other unspecified resources.
An invocation instruction is one of the instructions starting YIELD or CALL or one of the CALL aliases. The invocation must immediately follow this instruction. Use ASSERT_NEXT_CALL_RETURNS_ONLY instead if you want to disallow other unspecified resource balances. To construct constraints, you may wish to see the definition of ManifestResourceConstraint and GeneralResourceConstraint defined here (https://github.com/radixdlt/radixdlt-scrypto/blob/develop/radix-common/src/data/manifest/model/manifest_resource_assertion.rs) . Example ASSERT_NEXT_CALL_RETURNS_INCLUDE Map( Address("${non_fungible_resource_address}") => Enum( Array( NonFungibleLocalId("") ) ), ) ; ASSERT_BUCKET_CONTENTS (Added in V2) Asserts that the named bucket’s contents match the given constraints. To construct constraints, you may wish to see the definition of ManifestResourceConstraint and GeneralResourceConstraint defined here (https://github.com/radixdlt/radixdlt-scrypto/blob/develop/radix-common/src/data/manifest/model/manifest_resource_assertion.rs) . Example ASSERT_BUCKET_CONTENTS Bucket("bucket") Enum( # ManifestResourceConstraint Tuple( # GeneralResourceConstraint Array(), # - required_ids Enum(), # - lower_bound Enum( # - upper_bound Decimal("123") ), Enum() # - allowed_ids ) ) ; Proof Lifecycle CREATE_PROOF_FROM_BUCKET_OF_AMOUNT Creates a proof of some amount from a bucket. Example TAKE_ALL_FROM_WORKTOP Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3") Bucket("xrd_bucket") ; CREATE_PROOF_FROM_BUCKET_OF_AMOUNT Bucket("xrd_bucket") Decimal("1.0") Proof("proof") ; CREATE_PROOF_FROM_BUCKET_OF_NON_FUNGIBLES Creates a proof of some non-fungibles from a bucket.
Example TAKE_ALL_FROM_WORKTOP Address("resource_sim1n2q4le7dpzucmpnksxj5ku28r3t776pgk879cahgm76c2kfpz48fpj") Bucket("nfts") ; CREATE_PROOF_FROM_BUCKET_OF_NON_FUNGIBLES Bucket("nfts") Array(NonFungibleLocalId("#123#")) Proof("proof1b") ; CREATE_PROOF_FROM_BUCKET_OF_ALL Creates a proof of all the resources in a bucket. Example TAKE_ALL_FROM_WORKTOP Address("resource_sim1n2q4le7dpzucmpnksxj5ku28r3t776pgk879cahgm76c2kfpz48fpj") Bucket("nfts") ; CREATE_PROOF_FROM_BUCKET_OF_ALL Bucket("nfts") Proof("proof") ; CREATE_PROOF_FROM_AUTH_ZONE_OF_AMOUNT Creates a proof of some amount of resource from the auth zone. Example CREATE_PROOF_FROM_AUTH_ZONE_OF_AMOUNT Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3") Decimal("1.0") Proof("proof") ; CREATE_PROOF_FROM_AUTH_ZONE_OF_NON_FUNGIBLES Creates a proof of some non-fungibles from the auth zone. Example CREATE_PROOF_FROM_AUTH_ZONE_OF_NON_FUNGIBLES Address("resource_sim1n2q4le7dpzucmpnksxj5ku28r3t776pgk879cahgm76c2kfpz48fpj") Array(NonFungibleLocalId("#123#")) Proof("proof") ; CREATE_PROOF_FROM_AUTH_ZONE_OF_ALL Creates the max proof of some resource from the auth zone. Example CREATE_PROOF_FROM_AUTH_ZONE_OF_ALL Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3") Proof("proof"); CLONE_PROOF Clones a proof. Example POP_FROM_AUTH_ZONE Proof("proof") ; CLONE_PROOF Proof("proof") Proof("cloned_proof") ; DROP_PROOF Drops a proof. This allows resources locked by the proof to be removed if all proofs against those resources are dropped. Example POP_FROM_AUTH_ZONE Proof("proof") ; DROP_PROOF Proof("proof") ; PUSH_TO_AUTH_ZONE Pushes a proof to the auth zone. Example POP_FROM_AUTH_ZONE Proof("proof") ; PUSH_TO_AUTH_ZONE Proof("proof") ; POP_FROM_AUTH_ZONE Pops the most recent proof from the auth zone. Example POP_FROM_AUTH_ZONE Proof("proof") ; DROP_AUTH_ZONE_PROOFS Removes all proofs in the auth zone.
Example DROP_AUTH_ZONE_PROOFS ; DROP_AUTH_ZONE_REGULAR_PROOFS Removes all regular (non-signature) proofs in the auth zone. Example DROP_AUTH_ZONE_REGULAR_PROOFS ; DROP_AUTH_ZONE_SIGNATURE_PROOFS Removes all signature proofs in the auth zone. Example DROP_AUTH_ZONE_SIGNATURE_PROOFS ; DROP_NAMED_PROOFS Drops all of the named proofs. Example DROP_NAMED_PROOFS ; DROP_ALL_PROOFS Drops all of the proofs in the auth zone and the named proofs. Example DROP_ALL_PROOFS ; Invocations CALL_FUNCTION Invokes a function on a blueprint. Example CALL_FUNCTION Address("package_sim1p4nk9h5kw2mcmwn5u2xcmlmwap8j6dzet7w7zztzz55p70rgqs4vag") "Hello" "instantiate_hello" 66u32 ; CREATE_ACCOUNT (alias) Create a native account component. This is an alias for CALL_FUNCTION to the account blueprint "create" function. Example CREATE_ACCOUNT ; CREATE_ACCOUNT_ADVANCED (alias) Create a native account component with an OwnerRole configuration. This is an alias for CALL_FUNCTION to the account blueprint "create_advanced" function. Example CREATE_ACCOUNT_ADVANCED Enum( Enum() ) None ; CREATE_ACCESS_CONTROLLER (alias) Create an access controller native component. This is an alias for CALL_FUNCTION to the access controller blueprint "create" function. Example TAKE_ALL_FROM_WORKTOP Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3") Bucket("bucket1") ; CREATE_ACCESS_CONTROLLER Bucket("bucket1") Tuple( Enum<1u8>(), # primary role Enum<1u8>(), # recovery role Enum<1u8>() # confirmation role ) None # timed recovery delay in minutes None # address reservation ; CREATE_FUNGIBLE_RESOURCE (alias) Create a fungible resource and deposit it on the worktop. This is an alias for a CALL_FUNCTION to the fungible resource blueprint’s "create" function. 
Example CREATE_FUNGIBLE_RESOURCE # Owner role - This gets metadata permissions, and is the default for other permissions # Can set as Enum(access_rule) or Enum(access_rule) Enum() true # Whether the engine should track supply (avoid for massively parallelizable tokens) 18u8 # Divisibility (between 0u8 and 18u8) Tuple( Some( # Mint Roles (if None: defaults to DenyAll, DenyAll) Tuple( Some(Enum()), # Minter (if None: defaults to Owner) Some(Enum()) # Minter Updater (if None: defaults to Owner) ) ), None, # Burn Roles (if None: defaults to DenyAll, DenyAll) None, # Freeze Roles (if None: defaults to DenyAll, DenyAll) None, # Recall Roles (if None: defaults to DenyAll, DenyAll) None, # Withdraw Roles (if None: defaults to AllowAll, DenyAll) None # Deposit Roles (if None: defaults to AllowAll, DenyAll) ) Tuple( # Metadata initialization Map( # Initial metadata values "name" => Tuple( Some(Enum("MyResource")), # Resource Name true # Locked ) ), Map( # Metadata roles "metadata_setter" => Some(Enum()), # Metadata setter role "metadata_setter_updater" => None, # Metadata setter updater role as None defaults to OWNER "metadata_locker" => Some(Enum()), # Metadata locker role "metadata_locker_updater" => None # Metadata locker updater role as None defaults to OWNER ) ) None # No Address Reservation ; CREATE_FUNGIBLE_RESOURCE_WITH_INITIAL_SUPPLY (alias) Create a fungible resource with an initial supply, which is deposited on the worktop. This is an alias for a CALL_FUNCTION to the fungible resource blueprint’s "create_with_initial_supply" function.
Example CREATE_FUNGIBLE_RESOURCE_WITH_INITIAL_SUPPLY # Owner role - This gets metadata permissions, and is the default for other permissions # Can set as Enum(access_rule) or Enum(access_rule) Enum() true # Whether the engine should track supply (avoid for massively parallelizable tokens) 18u8 # Divisibility (between 0u8 and 18u8) Decimal("10000") # Initial supply Tuple( Some( # Mint Roles (if None: defaults to DenyAll, DenyAll) Tuple( Some(Enum()), # Minter (if None: defaults to Owner) Some(Enum()) # Minter Updater (if None: defaults to Owner) ) ), None, # Burn Roles (if None: defaults to DenyAll, DenyAll) None, # Freeze Roles (if None: defaults to DenyAll, DenyAll) None, # Recall Roles (if None: defaults to DenyAll, DenyAll) None, # Withdraw Roles (if None: defaults to AllowAll, DenyAll) None # Deposit Roles (if None: defaults to AllowAll, DenyAll) ) Tuple( # Metadata initialization Map( # Initial metadata values "name" => Tuple( Some(Enum("MyResource")), # Resource Name true # Locked ) ), Map( # Metadata roles "metadata_setter" => Some(Enum()), # Metadata setter role "metadata_setter_updater" => None, # Metadata setter updater role as None defaults to OWNER "metadata_locker" => Some(Enum()), # Metadata locker role "metadata_locker_updater" => None # Metadata locker updater role as None defaults to OWNER ) ) None # No Address Reservation ; CREATE_NON_FUNGIBLE_RESOURCE (alias) Create a non-fungible resource from the worktop. This is an alias for CALL_FUNCTION to the non fungible resource blueprint’s "create" function. 
Example

    CREATE_NON_FUNGIBLE_RESOURCE
        # Owner role - This gets metadata permissions, and is the default for other permissions
        # Can set as Enum<OwnerRole::Fixed>(access_rule) or Enum<OwnerRole::Updatable>(access_rule)
        Enum<OwnerRole::None>()
        Enum() # The type of NonFungible Id
        true # Whether the engine should track supply (avoid for massively parallelizable tokens)
        Enum<0u8>(Enum<0u8>(Tuple(Array(), Array(), Array())), Enum<0u8>(66u8), Array()) # Non Fungible Data Schema
        Tuple(
            Some( # Mint Roles (if None: defaults to DenyAll, DenyAll)
                Tuple(
                    Some(Enum()), # Minter (if None: defaults to Owner)
                    Some(Enum()) # Minter Updater (if None: defaults to Owner)
                )
            ),
            None, # Burn Roles (if None: defaults to DenyAll, DenyAll)
            None, # Freeze Roles (if None: defaults to DenyAll, DenyAll)
            None, # Recall Roles (if None: defaults to DenyAll, DenyAll)
            None, # Withdraw Roles (if None: defaults to AllowAll, DenyAll)
            None, # Deposit Roles (if None: defaults to AllowAll, DenyAll)
            None # Non Fungible Data Update Roles (if None: defaults to DenyAll, DenyAll)
        )
        Tuple( # Metadata initialization
            Map( # Initial metadata values
                "name" => Tuple(
                    Some(Enum<Metadata::String>("MyResource")), # Resource Name
                    true # Locked
                )
            ),
            Map( # Metadata roles
                "metadata_setter" => Some(Enum()), # Metadata setter role
                "metadata_setter_updater" => None, # Metadata setter updater role as None defaults to OWNER
                "metadata_locker" => Some(Enum()), # Metadata locker role
                "metadata_locker_updater" => None # Metadata locker updater role as None defaults to OWNER
            )
        )
        None; # No Address Reservation

CREATE_NON_FUNGIBLE_RESOURCE_WITH_INITIAL_SUPPLY (alias) Creates a non-fungible resource with an initial supply, which is deposited on the worktop. This is an alias for CALL_FUNCTION to the non-fungible resource blueprint’s "create_with_initial_supply" function.
Example

    CREATE_NON_FUNGIBLE_RESOURCE_WITH_INITIAL_SUPPLY
        # Owner role - This gets metadata permissions, and is the default for other permissions
        # Can set as Enum<OwnerRole::Fixed>(access_rule) or Enum<OwnerRole::Updatable>(access_rule)
        Enum<OwnerRole::None>()
        Enum() # The type of NonFungible Id
        true # Whether the engine should track supply (avoid for massively parallelizable tokens)
        Enum<0u8>(Enum<0u8>(Tuple(Array(), Array(), Array())), Enum<0u8>(66u8), Array()) # Non Fungible Data Schema
        Map( # Initial supply to mint
            NonFungibleLocalId("#1#") => Tuple(Tuple())
        )
        Tuple(
            Some( # Mint Roles (if None: defaults to DenyAll, DenyAll)
                Tuple(
                    Some(Enum()), # Minter (if None: defaults to Owner)
                    Some(Enum()) # Minter Updater (if None: defaults to Owner)
                )
            ),
            None, # Burn Roles (if None: defaults to DenyAll, DenyAll)
            None, # Freeze Roles (if None: defaults to DenyAll, DenyAll)
            None, # Recall Roles (if None: defaults to DenyAll, DenyAll)
            None, # Withdraw Roles (if None: defaults to AllowAll, DenyAll)
            None, # Deposit Roles (if None: defaults to AllowAll, DenyAll)
            None # Non Fungible Data Update Roles (if None: defaults to DenyAll, DenyAll)
        )
        Tuple( # Metadata initialization
            Map( # Initial metadata values
                "name" => Tuple(
                    Some(Enum<Metadata::String>("MyResource")), # Resource Name
                    true # Locked
                )
            ),
            Map( # Metadata roles
                "metadata_setter" => Some(Enum()), # Metadata setter role
                "metadata_setter_updater" => None, # Metadata setter updater role as None defaults to OWNER
                "metadata_locker" => Some(Enum()), # Metadata locker role
                "metadata_locker_updater" => None # Metadata locker updater role as None defaults to OWNER
            )
        )
        None # No Address Reservation
    ;

CREATE_IDENTITY (alias) Create a native Identity component. This is an alias for CALL_FUNCTION to the identity blueprint’s "create" function.

Example

    CREATE_IDENTITY ;

CREATE_IDENTITY_ADVANCED (alias) Create a native Identity component with an OwnerRole configuration. This is an alias for CALL_FUNCTION to the identity blueprint’s "create_advanced" function.
Example

    CREATE_IDENTITY_ADVANCED
        Enum()
    ;

CREATE_VALIDATOR (alias) Create a native validator component. This is an alias for a CALL_METHOD to the consensus manager’s "create_validator" method.

Example

    TAKE_FROM_WORKTOP
        Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
        Decimal("1000")
        Bucket("bucket1")
    ;
    CREATE_VALIDATOR
        Bytes("02c6047f9441ed7d6d3045406e95c07cd85c778e4b8cef3ca7abac09b95c709ee5")
        Decimal("1")
        Bucket("bucket1")
    ;

PUBLISH_PACKAGE (alias) Publishes a package, giving a package owner badge defined in the owner role. It’s generally recommended to instead use PUBLISH_PACKAGE_ADVANCED for more control. This is an alias for CALL_FUNCTION to the package blueprint’s "publish_wasm" function.

Example

    PUBLISH_PACKAGE
        Tuple( # package definition
            Map(
                "Hello" => Tuple(
                    Enum<0u8>(),
                    false,
                    Array(),
                    Array
    (),
    Tuple( Array(), Enum<0u8>( Tuple( Array( Enum<14u8>( Array( Enum<0u8>( 167u8 ) ) ), Enum<14u8>( Array() ), Enum<17u8>( Enum<0u8>() ), Enum<14u8>( Array() ) ), Array( Tuple( Enum<1u8>( "Hello" ), Enum<1u8>( Enum<0u8>( Array( "sample_vault" ) ) ) ), Tuple( Enum<1u8>( "Hello_instantiate_hello_Input" ), Enum<1u8>( Enum<0u8>( Array() ) ) ), Tuple( Enum<1u8>( "GlobalHello" ), Enum<0u8>() ), Tuple( Enum<1u8>( "Hello_free_token_Input" ), Enum<1u8>( Enum<0u8>( Array() ) ) ) ), Array( Enum<0u8>(), Enum<0u8>(), Enum<14u8>( Enum<0u8>( Enum<4u8>( Enum<0u8>(), "Hello" ) ) ), Enum<0u8>() ) ) ), Tuple( Array( Tuple( Enum<0u8>( Enum<1u8>( 0u64 ) ), Enum<0u8>(), Enum<0u8>() ) ), Array() ), Map(), Map(), Tuple( Map( "instantiate_hello" => Tuple( Enum<0u8>(), Enum<0u8>( Enum<1u8>( 1u64 ) ), Enum<0u8>( Enum<1u8>( 2u64 ) ), "Hello_instantiate_hello" ), "free_token" => Tuple( Enum<1u8>( Tuple( Enum<1u8>(), Tuple( 1u32 ) ) ), Enum<0u8>( Enum<1u8>( 3u64 ) ), Enum<0u8>( Enum<0u8>( 161u8 ) ), "Hello_free_token" ) ) ), Tuple( Map() ) ), Enum<0u8>(), Tuple( Enum<0u8>(), Enum<0u8>() ) ) ) )
    Blob("4cdfc89a539f1cb6fac327c07e95a2f120ea5af6d2e2adfae093f81e1d185925") # code
    Map() # metadata
    ;

PUBLISH_PACKAGE_ADVANCED (alias) Publishes a package. This is an alias for CALL_FUNCTION to the package blueprint’s "publish_wasm_advanced" function.

Example

    PUBLISH_PACKAGE_ADVANCED
        Enum<1u8>( # owner rule
            Enum<2u8>( Enum<0u8>( Enum<0u8>( Enum<0u8>( NonFungibleGlobalId("resource_sim1nfzf2h73frult99zd060vfcml5kncq3mxpthusm9lkglvhsr0guahy:#1#") ) ) ) )
        )
        Tuple( # package definition
            Map(
                "Hello" => Tuple(
                    Enum<0u8>(),
                    false,
                    Array(),
                    Array
    (),
    Tuple( Array(), Enum<0u8>( Tuple( Array( Enum<14u8>( Array( Enum<0u8>( 167u8 ) ) ), Enum<14u8>( Array() ), Enum<17u8>( Enum<0u8>() ), Enum<14u8>( Array() ) ), Array( Tuple( Enum<1u8>( "Hello" ), Enum<1u8>( Enum<0u8>( Array( "sample_vault" ) ) ) ), Tuple( Enum<1u8>( "Hello_instantiate_hello_Input" ), Enum<1u8>( Enum<0u8>( Array() ) ) ), Tuple( Enum<1u8>( "GlobalHello" ), Enum<0u8>() ), Tuple( Enum<1u8>( "Hello_free_token_Input" ), Enum<1u8>( Enum<0u8>( Array() ) ) ) ), Array( Enum<0u8>(), Enum<0u8>(), Enum<14u8>( Enum<0u8>( Enum<4u8>( Enum<0u8>(), "Hello" ) ) ), Enum<0u8>() ) ) ), Tuple( Array( Tuple( Enum<0u8>( Enum<1u8>( 0u64 ) ), Enum<0u8>(), Enum<0u8>() ) ), Array() ), Map(), Map(), Tuple( Map( "instantiate_hello" => Tuple( Enum<0u8>(), Enum<0u8>( Enum<1u8>( 1u64 ) ), Enum<0u8>( Enum<1u8>( 2u64 ) ), "Hello_instantiate_hello" ), "free_token" => Tuple( Enum<1u8>( Tuple( Enum<1u8>(), Tuple( 1u32 ) ) ), Enum<0u8>( Enum<1u8>( 3u64 ) ), Enum<0u8>( Enum<0u8>( 161u8 ) ), "Hello_free_token" ) ) ), Tuple( Map() ) ), Enum<0u8>(), Tuple( Enum<0u8>(), Enum<0u8>() ) ) ) )
    Blob("4cdfc89a539f1cb6fac327c07e95a2f120ea5af6d2e2adfae093f81e1d185925") # code
    Map() # metadata
    Enum<0u8>() # optional address reservation
    ;

CALL_METHOD Invokes a method on the main module of a component. See native blueprints (#native-blueprints) below for more typical methods.
Example

    TAKE_ALL_FROM_WORKTOP
        Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
        Bucket("xrd_bucket")
    ;
    CALL_METHOD
        Address("component_sim1cpvs7ulg02ah8mhcc84q7zsj4ta3pfz77uknu29xy50yelakkujqze")
        "buy_gumball"
        Bucket("xrd_bucket")
    ;
    CALL_ROYALTY_METHOD
        Address("component_sim1cpvs7ulg02ah8mhcc84q7zsj4ta3pfz77uknu29xy50yelakkujqze")
        "set_royalty"
        "my_method"
        Enum<0u8>()
    ;
    CALL_METADATA_METHOD
        Address("component_sim1cpvs7ulg02ah8mhcc84q7zsj4ta3pfz77uknu29xy50yelakkujqze")
        "get"
        "HelloWorld"
    ;
    CALL_ROLE_ASSIGNMENT_METHOD
        Address("component_sim1cpvs7ulg02ah8mhcc84q7zsj4ta3pfz77uknu29xy50yelakkujqze")
        "get"
        Enum<0u8>()
        "hello"
    ;

MINT_FUNGIBLE (alias) Mints a fungible resource and deposits it on the worktop. This is an alias for CALL_METHOD to a fungible resource’s "mint" method.

Example

    MINT_FUNGIBLE
        Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
        Decimal("1.0")
    ;

MINT_NON_FUNGIBLE (alias) Mints a non-fungible resource and deposits it on the worktop. This is an alias for CALL_METHOD to a non-fungible resource’s "mint" method.

Example

    # This non-fungible token has no data
    MINT_NON_FUNGIBLE
        Address("resource_sim1n2q4le7dpzucmpnksxj5ku28r3t776pgk879cahgm76c2kfpz48fpj")
        Map(
            NonFungibleLocalId("#555#") => Tuple(Tuple())
        )
    ;
    # This non-fungible token has 2 String data fields and 1 Decimal
    MINT_NON_FUNGIBLE
        Address("resource_sim1n2q4le7dpzucmpnksxj5ku28r3t776pgk879cahgm76c2kfpz48fpj")
        Map(
            NonFungibleLocalId("") => Tuple(
                Tuple(
                    "Hello",
                    "World",
                    Decimal("12"),
                )
            )
        )
    ;

MINT_RUID_NON_FUNGIBLE (alias) Mints a non-fungible resource with a random Radix Unique Identifier (RUID). Using this instruction, you only have to provide the data; a random RUID will be attached to each entry. This is an alias for CALL_METHOD to a non-fungible resource’s "mint_ruid" method.
Example

    # This non-fungible token has 2 data fields, "Hello World" and Decimal("12")
    MINT_RUID_NON_FUNGIBLE
        Address("resource_sim1n2q4le7dpzucmpnksxj5ku28r3t776pgk879cahgm76c2kfpz48fpj")
        Array(
            Tuple(Tuple("Hello World", Decimal("12")))
        )
    ;

CLAIM_PACKAGE_ROYALTIES (alias) Claims royalty on an owned package. This is an alias for CALL_METHOD to the package blueprint’s "PackageRoyalty_claim_royalties" method.

Example

    CLAIM_PACKAGE_ROYALTIES
        Address("package_sim1p4nk9h5kw2mcmwn5u2xcmlmwap8j6dzet7w7zztzz55p70rgqs4vag")
    ;

Learn more about Royalties (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/royalties/using-royalties.md) .

CALL_ROYALTY_METHOD Invokes a method on the royalty module of a component. Typically you use an alias instruction for this instead (see following instructions).

Example

    CALL_ROYALTY_METHOD
        Address("component_sim1cpvs7ulg02ah8mhcc84q7zsj4ta3pfz77uknu29xy50yelakkujqze")
        "set_royalty"
        "my_method"
        Enum<0u8>()
    ;

SET_COMPONENT_ROYALTY (alias) Sets the royalty configuration on a component. This is an alias for CALL_ROYALTY_METHOD "set_royalty".

Example

    SET_COMPONENT_ROYALTY
        Address("component_sim1czawvqpwnuwew7zfnsflhyxgmaf4hckjawlfku6jfpet9efa4aeskg")
        "my_method"
        Enum()
    ;

CLAIM_COMPONENT_ROYALTIES (alias) Claims royalty on an owned component. This is an alias for CALL_ROYALTY_METHOD "claim_royalties".

Example

    CLAIM_COMPONENT_ROYALTIES
        Address("component_sim1czawvqpwnuwew7zfnsflhyxgmaf4hckjawlfku6jfpet9efa4aeskg")
    ;

Learn more about Royalties (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/royalties/using-royalties.md) .

LOCK_COMPONENT_ROYALTY (alias) Locks the royalty configuration on a component. This is an alias for CALL_ROYALTY_METHOD "lock_royalty".

Example

    LOCK_COMPONENT_ROYALTY
        Address("component_sim1czawvqpwnuwew7zfnsflhyxgmaf4hckjawlfku6jfpet9efa4aeskg")
        "my_method"
    ;

CALL_METADATA_METHOD Invokes a method on the metadata module of a component.
Typically you use an alias instruction for this instead (see following instructions).

Example

    CALL_METADATA_METHOD
        Address("component_sim1cpvs7ulg02ah8mhcc84q7zsj4ta3pfz77uknu29xy50yelakkujqze")
        "get"
        "HelloWorld"
    ;

SET_METADATA (alias) Sets a metadata entry of a global entity. This is an alias for CALL_METADATA_METHOD "set".

Example

    # Set String Metadata on Package
    SET_METADATA
        Address("package_sim1p4nk9h5kw2mcmwn5u2xcmlmwap8j6dzet7w7zztzz55p70rgqs4vag")
        "field_name"
        # "Metadata::String" is equivalent to 0u8
        Enum<Metadata::String>(
            "Metadata string value, eg description"
        )
    ;
    # Set String Metadata on Account component
    SET_METADATA
        Address("account_sim1c8zvh2cd59pnwfqkkre0rflfhshrhptkweyf7vzefw08u0l3mk2anh")
        "field_name"
        # "Metadata::String" is equivalent to 0u8
        Enum<Metadata::String>(
            "Metadata string value, eg description"
        )
    ;
    # Set String Metadata on Resource
    SET_METADATA
        Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
        "field_name"
        # "Metadata::String" is equivalent to 0u8
        Enum<Metadata::String>(
            "Metadata string value, eg description"
        )
    ;
    # Set Bool Metadata on Resource
    SET_METADATA
        Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
        "field_name"
        # "Metadata::Bool" is equivalent to 1u8
        Enum<Metadata::Bool>(
            true
        )
    ;
    # Set u8 Metadata on Resource
    SET_METADATA
        Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
        "field_name"
        # "Metadata::U8" is equivalent to 2u8
        Enum<Metadata::U8>(
            123u8
        )
    ;
    # Set u32 Metadata on Resource
    SET_METADATA
        Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
        "field_name"
        # "Metadata::U32" is equivalent to 3u8
        Enum<Metadata::U32>(
            123u32
        )
    ;
    # Set u64 Metadata on Resource
    SET_METADATA
        Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
        "field_name"
        # "Metadata::U64" is equivalent to 4u8
        Enum<Metadata::U64>(
            123u64
        )
    ;
    # Set i32 Metadata on Resource
    SET_METADATA
        Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
        "field_name"
        # "Metadata::I32" is equivalent to 5u8
        Enum<Metadata::I32>(
            -123i32
        )
    ;
    # Set i64 Metadata on Resource
    SET_METADATA
        Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
        "field_name"
        # "Metadata::I64" is equivalent to 6u8
        Enum<Metadata::I64>(
            -123i64
        )
    ;
    # Set Decimal Metadata on Resource
    SET_METADATA
        Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
        "field_name"
        # "Metadata::Decimal" is equivalent to 7u8
        Enum<Metadata::Decimal>(
            # Single item
            Decimal("10.5")
        )
    ;
    # Set Address Metadata on Resource
    SET_METADATA
        Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
        "field_name"
        # "Metadata::Address" is equivalent to 8u8
        Enum<Metadata::Address>(
            Address("account_sim1c8zvh2cd59pnwfqkkre0rflfhshrhptkweyf7vzefw08u0l3mk2anh")
        )
    ;
    # Set Public Key Metadata on Resource
    # NOTE: Also see "PublicKeyHash" further down
    SET_METADATA
        Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
        "field_name"
        # "Metadata::PublicKey" is equivalent to 9u8
        Enum<Metadata::PublicKey>(
            Enum(
                # 0u8 = Secp256k1, 1u8 = Ed25519
                # Hex-encoded canonical-Radix encoding of the public key
                Bytes("0000000000000000000000000000000000000000000000000000000000000000ff")
            )
        )
    ;
    # Set NonFungibleGlobalId Metadata on Resource
    SET_METADATA
        Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
        "field_name"
        # "Metadata::NonFungibleGlobalId" is equivalent to 10u8
        Enum<Metadata::NonFungibleGlobalId>(
            NonFungibleGlobalId("resource_sim1n2q4le7dpzucmpnksxj5ku28r3t776pgk879cahgm76c2kfpz48fpj:")
        )
    ;
    # Set NonFungibleLocalId Metadata on Resource
    SET_METADATA
        Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
        "field_name"
        # "Metadata::NonFungibleLocalId" is equivalent to 11u8
        Enum<Metadata::NonFungibleLocalId>(
            NonFungibleLocalId("")
        )
    ;
    # Set Instant (or the value in seconds since unix epoch) Metadata on Resource
    SET_METADATA
        Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
        "field_name"
        # "Metadata::Instant" is equivalent to 12u8
        Enum<Metadata::Instant>(
            # Value in seconds since Unix Epoch
            10000i64
        )
    ;
    # Set Url Metadata on Resource
    SET_METADATA
        Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
        "field_name"
        # "Metadata::Url" is equivalent to 13u8
        Enum<Metadata::Url>(
            # Single item
            "https://radixdlt.com/index.html"
        )
    ;
    # Set Origin Metadata on Resource
    SET_METADATA
        Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
        "field_name"
        # "Metadata::Origin" is equivalent to 14u8
        Enum<Metadata::Origin>(
            "https://radixdlt.com"
        )
    ;
    # Set PublicKeyHash Metadata on Resource
    SET_METADATA
        Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
        "field_name"
        # "Metadata::PublicKeyHash" is equivalent to 15u8
        Enum<Metadata::PublicKeyHash>(
            Enum(
                # 0u8 = Secp256k1, 1u8 = Ed25519
                # The hex-encoded final 29 bytes of the Blake2b-256 hash of the public key bytes (in the canonical Radix encoding)
                Bytes("0000000000000000000000000000000000000000000000000000000000")
            )
        )
    ;
    # Setting list-based metadata:
    # ============================
    # If using enum discriminator aliases: Take "Metadata::X" and add Array to the end, eg "Metadata::XArray"
    # If using u8 enum discriminators: Add 128 to the single values
    #
    # Then just make the content an Array.
    #
    # For example, for strings:
    SET_METADATA
        Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
        "field_name"
        # "Metadata::StringArray" is equivalent to 128u8
        Enum<Metadata::StringArray>(
            Array(
                "some_string",
                "another_string",
                "yet_another_string"
            )
        )
    ;

REMOVE_METADATA (alias) Removes the metadata key and value of a global entity. This is an alias for CALL_METADATA_METHOD "remove".

Example

    REMOVE_METADATA
        Address("package_sim1p4nk9h5kw2mcmwn5u2xcmlmwap8j6dzet7w7zztzz55p70rgqs4vag")
        "field_name";
    REMOVE_METADATA
        Address("account_sim1c8zvh2cd59pnwfqkkre0rflfhshrhptkweyf7vzefw08u0l3mk2anh")
        "field_name";
    REMOVE_METADATA
        Address("resource_sim1nfnwxdfw2tjzzkvyj9kxe64en6nhnelqmrxwrmgnly4a4wugc6e6cu")
        "field_name";

LOCK_METADATA (alias) Locks a metadata entry of a global entity, preventing it from being updated. This is an alias for CALL_METADATA_METHOD "lock".
Example

    LOCK_METADATA
        Address("package_sim1p4nk9h5kw2mcmwn5u2xcmlmwap8j6dzet7w7zztzz55p70rgqs4vag")
        "field_name";
    LOCK_METADATA
        Address("account_sim1c8zvh2cd59pnwfqkkre0rflfhshrhptkweyf7vzefw08u0l3mk2anh")
        "field_name";
    LOCK_METADATA
        Address("resource_sim1nfnwxdfw2tjzzkvyj9kxe64en6nhnelqmrxwrmgnly4a4wugc6e6cu")
        "field_name";

CALL_ROLE_ASSIGNMENT_METHOD Invokes a method on the role assignment module of a component. Typically you use an alias instruction for this instead (see following instructions).

Example

    CALL_ROLE_ASSIGNMENT_METHOD
        Address("component_sim1cpvs7ulg02ah8mhcc84q7zsj4ta3pfz77uknu29xy50yelakkujqze")
        "get"
        Enum<0u8>()
        "hello"
    ;

SET_ROLE (alias) Sets the access rule for a specified role of a global entity. This is an alias for CALL_ROLE_ASSIGNMENT_METHOD "set".

Example

    SET_ROLE
        Address("component_sim1cpvs7ulg02ah8mhcc84q7zsj4ta3pfz77uknu29xy50yelakkujqze")
        Enum<ModuleId::Main>() # Main, Metadata, Royalty or RoleAssignment. You typically want "ModuleId::Main" here, unless setting a native role from the Metadata / Royalty modules.
        "role_name" # The name of the role to update the access rule for.
        Enum( # The access rule associated with the role
            Enum(
                Enum(
                    Enum( # Either NonFungible or Resource, which contains either a NonFungibleGlobalId or a Address("")
                        NonFungibleGlobalId("resource_sim1n2q4le7dpzucmpnksxj5ku28r3t776pgk879cahgm76c2kfpz48fpj:#123#")
                    )
                )
            )
        )
    ;

SET_OWNER_ROLE (alias) Sets an OwnerRole configuration of a global entity. This is an alias for CALL_ROLE_ASSIGNMENT_METHOD "set_owner".

Example

    SET_OWNER_ROLE
        Address("resource_sim1n2q4le7dpzucmpnksxj5ku28r3t776pgk879cahgm76c2kfpz48fpj")
        Enum(
            Enum(
                Enum(
                    Enum( # Either NonFungible or Resource, which contains either a NonFungibleGlobalId or a Address("")
                        NonFungibleGlobalId("resource_sim1n2q4le7dpzucmpnksxj5ku28r3t776pgk879cahgm76c2kfpz48fpj:#123#")
                    )
                )
            )
        )
    ;

LOCK_OWNER_ROLE (alias) Locks an OwnerRole configuration of a global entity. This is an alias for CALL_ROLE_ASSIGNMENT_METHOD "lock_owner".
Example

    LOCK_OWNER_ROLE
        Address("resource_sim1n2q4le7dpzucmpnksxj5ku28r3t776pgk879cahgm76c2kfpz48fpj")
    ;

CALL_DIRECT_VAULT_METHOD Invokes a direct access method on the given vault. Typically you use an alias instruction for this instead (see following instructions).

RECALL_FROM_VAULT (alias) Recalls a fungible resource from a vault (if the resource has a recallable behavior). This is an alias for CALL_DIRECT_VAULT_METHOD "recall".

Example

    RECALL_FROM_VAULT
        Address("internal_vault_sim1tpv9skzctpv9skzctpv9skzctpv9skzctpv9skzctpv9skzcuxymgh")
        Decimal("1.2")
    ;

RECALL_NON_FUNGIBLES_FROM_VAULT (alias) Recalls a non-fungible resource from a vault (if the resource has a recallable behavior). This is an alias for CALL_DIRECT_VAULT_METHOD "recall_non_fungibles".

Example

    RECALL_NON_FUNGIBLES_FROM_VAULT
        Address("internal_vault_sim1nzvf3xycnzvf3xycnzvf3xycnzvf3xycnzvf3xycnzvf3xyc4apn28")
        Array(NonFungibleLocalId("#123#"), NonFungibleLocalId("#456#"))
    ;

FREEZE_VAULT (alias) The FREEZE_VAULT manifest instruction has several configurations:
- Freeze withdraws of a Vault.
- Freeze deposits of a Vault.
- Freeze resource burns of a Vault.
- Freeze withdraws/deposits/burns of a Vault.

This is an alias for CALL_DIRECT_VAULT_METHOD "freeze".

Example

    # Freeze Withdraws from a vault
    FREEZE_VAULT
        Address("internal_vault_sim1tpv9skzctpv9skzctpv9skzctpv9skzctpv9skzctpv9skzcuxymgh")
        Tuple(1u32)
    ;
    # Freeze Deposits into a vault
    FREEZE_VAULT
        Address("internal_vault_sim1tpv9skzctpv9skzctpv9skzctpv9skzctpv9skzctpv9skzcuxymgh")
        Tuple(2u32)
    ;
    # Freeze Burns in a vault
    FREEZE_VAULT
        Address("internal_vault_sim1tpv9skzctpv9skzctpv9skzctpv9skzctpv9skzctpv9skzcuxymgh")
        Tuple(4u32)
    ;
    # Freeze Withdraws/Deposits/Burns of a vault
    FREEZE_VAULT
        Address("internal_vault_sim1tpv9skzctpv9skzctpv9skzctpv9skzctpv9skzctpv9skzcuxymgh")
        Tuple(7u32)
    ;

UNFREEZE_VAULT (alias) The UNFREEZE_VAULT manifest instruction has several configurations:
- Unfreeze withdraws of a Vault.
- Unfreeze deposits of a Vault.
- Unfreeze resource burns of a Vault.
- Unfreeze withdraws/deposits/burns of a Vault.

This is an alias for CALL_DIRECT_VAULT_METHOD "unfreeze".

Example

    # Unfreeze Withdraws from a vault
    UNFREEZE_VAULT
        Address("internal_vault_sim1tpv9skzctpv9skzctpv9skzctpv9skzctpv9skzctpv9skzcuxymgh")
        Tuple(1u32)
    ;
    # Unfreeze Deposits into a vault
    UNFREEZE_VAULT
        Address("internal_vault_sim1tpv9skzctpv9skzctpv9skzctpv9skzctpv9skzctpv9skzcuxymgh")
        Tuple(2u32)
    ;
    # Unfreeze Burns in a vault
    UNFREEZE_VAULT
        Address("internal_vault_sim1tpv9skzctpv9skzctpv9skzctpv9skzctpv9skzctpv9skzcuxymgh")
        Tuple(4u32)
    ;
    # Unfreeze Withdraws/Deposits/Burns of a vault
    UNFREEZE_VAULT
        Address("internal_vault_sim1tpv9skzctpv9skzctpv9skzctpv9skzctpv9skzctpv9skzcuxymgh")
        Tuple(7u32)
    ;

Address Allocation

USE_PREALLOCATED_ADDRESS (Pseudo-instruction | System only) This instruction can only be used by System transactions, such as genesis and protocol updates. It must appear at the start of the manifest, before all other instructions. It defines a pre-allocation of a particular address which can be used for instantiation later.

Example

    USE_PREALLOCATED_ADDRESS
        Address("package_sim1pkgxxxxxxxxxresrcexxxxxxxxx000538436477xxxxxxxxxaj0zg9")
        "FungibleResourceManager"
        AddressReservation("reservation1")
        Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
    ;

ALLOCATE_GLOBAL_ADDRESS This command can be used to reserve an address, which is useful when creating new global entities (aka “objects”) in the manifest and then interacting with them later in the same transaction. The command takes:
- The first two parameters, the package address and blueprint name, which scope the reserved address to be used by a new global entity of that type. An entity can be a component (i.e. referencing a scrypto blueprint), or a resource or package (by referencing their native blueprint).
- The third parameter is a name-binding for a newly created address reservation.
This address reservation will need to be passed into a constructor of an object, and used in the globalize call. It has to be used by the time the transaction ends.
- The fourth parameter is a name-binding for the newly created address. It can be used to call the newly created entity from the manifest after it’s been created.

In Scrypto:
- Constructors are recommended to take an Option<GlobalAddressReservation> to support this pattern. It’s also recommended they take an OwnerRole to be fully customisable.
- The address allocation trick can be used in Scrypto with the Runtime::allocate_component_address(..) method.

Example The following could be used to immediately call a function on an uploaded package in the same transaction. Note that the PUBLISH_PACKAGE_ADVANCED constructor includes the OwnerRole and Option<GlobalAddressReservation> parameters for full flexibility.

    ALLOCATE_GLOBAL_ADDRESS
        Address("${package_package_address}")
        "Package"
        AddressReservation("package_reservation")
        NamedAddress("my_package")
    ;
    PUBLISH_PACKAGE_ADVANCED
        Enum() # OwnerRole
        Tuple( # PackageDefinition
            Map()
        )
        Blob("${code_blob_hash}") # Code (from blob)
        Map() # MetadataInit
        Some(AddressReservation("package_reservation")) # Option<GlobalAddressReservation>
    ;
    CALL_FUNCTION
        NamedAddress("my_package")
        "BlueprintName"
        "function_call"
        # Arguments...
    ;

Interaction with other intents

USE_CHILD (Added in V2 | Pseudo-instruction) Must appear at the start of the manifest, before all other instructions. Defines an explicit child subintent, and names it so that it can be used with YIELD_TO_CHILD later in the manifest. It is expected that the constructor of a given manifest knows concretely the contents of any children that they include, so the subintent hash is used to specify the child. For the manifest to be valid, the number of YIELD_TO_CHILD instructions targeting a given child must match the number of YIELD_TO_PARENT instructions in the child manifest.
Example

    USE_CHILD
        NamedIntent("child_one")
        Intent("subtxid_sim1achf7hzm72jwhu7vqhuauypjdfnrnnkxnfazn0ue94mkaet5uz3q5g6m2t")
    ;

YIELD_TO_PARENT (Added in V2 | Subintent only) Pauses or finishes execution of the subintent, and yields to the parent intent. Can also be used to transfer buckets to the parent. Every subintent must end with an explicit YIELD_TO_PARENT to finish execution and return to the parent.

Example

    YIELD_TO_PARENT;
    YIELD_TO_PARENT Bucket("my_bucket") Bucket("my_other_bucket");
    YIELD_TO_PARENT Expression("ENTIRE_WORKTOP");

YIELD_TO_CHILD (Added in V2) Pauses execution of the manifest, and yields to the specified child intent. Execution of the manifest resumes when the child calls YIELD_TO_PARENT.

Examples

    USE_CHILD
        NamedIntent("child_one")
        Intent("subtxid_sim1achf7hzm72jwhu7vqhuauypjdfnrnnkxnfazn0ue94mkaet5uz3q5g6m2t")
    ;
    # ...
    YIELD_TO_CHILD
        NamedIntent("child_one")
    ;
    # ...
    YIELD_TO_CHILD
        NamedIntent("child_one")
        Bucket("my_bucket")
        Bucket("my_other_bucket")
    ;
    # ...
    YIELD_TO_CHILD
        NamedIntent("child_one")
        Expression("ENTIRE_WORKTOP")
    ;

VERIFY_PARENT (Added in V2 | Subintent only) Sometimes, when constructing a subintent, you only want it to be usable by a particular counterparty. For example, dApps can use it to give users or regulated integrators guarantees about who will consume the subintent. The VERIFY_PARENT instruction can be used to ensure a subintent can only be used as a direct child of an intent which can meet some authorization criteria. Specifically, it takes an access rule (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/advanced-accessrules.md) as an argument, and it asserts the access rule against the auth zone (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/README.md) of the parent intent's processor, which can see:
- Signatures of the parent intent, via a signature requirement (../../../build/scrypto-1/auth/advanced-accessrules.md#signature-requirements)
- Proofs (e.g. of badges) created during execution which are currently on the parent's auth zone

We recommend using the Rust Manifest Builder (https://github.com/gguuttss/radix-docs/blob/master/integrate/rust-libraries/rust-manifest-builder.md) to create this instruction, as the exact format of an access rule is quite fiddly in the manifest value model.

Examples

    VERIFY_PARENT
        Enum(
            Enum(
                Enum(
                    Enum(
                        # Ed25519 signature badge address : public key hash hex
                        # This is most easily created using the Rust Manifest Builder
                        NonFungibleGlobalId("resource_sim1nfxxxxxxxxxxed25sgxxxxxxxxx002236757237xxxxxxxxx8x44q5:6a8a691dae2cd15ed0369931ce0a949ecafa5c3f93f8121833646e15c3")
                    )
                )
            )
        )
    ;

Native Blueprints Methods callable on Native Blueprints (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/README.md) are listed within their specifications, e.g. Account (../../radix-engine/native-blueprints/account.md#blueprint-api-function-reference)

Arguments In the Radix transaction manifest, all arguments are strongly typed Manifest SBOR Values in Manifest Value Syntax (https://github.com/gguuttss/radix-docs/blob/master/reference/sbor-serialization/manifest-sbor/manifest-value-syntax.md) . When making method/function calls, the Manifest SBOR value is converted by the transaction processor into a Scrypto SBOR value, which is then used to make the engine call. As part of this engine call, the resultant Scrypto value is validated by the Radix Engine against the Component’s interface schema at runtime.

Named buckets Buckets and proofs created in a manifest can be referred to by name.
To create a named bucket and use it in a method call:

    TAKE_FROM_WORKTOP
        Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
        Decimal("1.0")
        Bucket("my_bucket")
    ;
    CALL_METHOD
        Address("account_sim1cx0l2jhr3lg8mzg3gp3xsun94vtw3440w77xddvvp4cjte5gdvz35k")
        "deposit"
        Bucket("my_bucket")
    ;

Named proofs There are multiple ways to create a named proof to pass by intent to a method. Most of the time, this is the syntax you would use:

    # Lock fees
    CALL_METHOD
        Address("account_sim1cx0l2jhr3lg8mzg3gp3xsun94vtw3440w77xddvvp4cjte5gdvz35k")
        "lock_fee"
        Decimal("10");
    # Create a proof of a badge on your account. The "create_proof_of_amount" method returns a Proof to the authzone.
    CALL_METHOD
        Address("account_sim1cx0l2jhr3lg8mzg3gp3xsun94vtw3440w77xddvvp4cjte5gdvz35k")
        "create_proof_of_amount"
        Address("resource_sim1n2q4le7dpzucmpnksxj5ku28r3t776pgk879cahgm76c2kfpz48fpj")
        Decimal("1");
    # Get a named proof from the last proof to have been inserted in the authzone.
    POP_FROM_AUTH_ZONE
        Proof("my_proof");
    # You can now pass this proof to a method/function

You can also create a proof from a bucket:

    # Lock fees
    CALL_METHOD
        Address("account_sim1cx0l2jhr3lg8mzg3gp3xsun94vtw3440w77xddvvp4cjte5gdvz35k")
        "lock_fee"
        Decimal("10");
    CALL_METHOD
        Address("account_sim1cx0l2jhr3lg8mzg3gp3xsun94vtw3440w77xddvvp4cjte5gdvz35k")
        "withdraw"
        Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
        Decimal("10")
    ;
    TAKE_FROM_WORKTOP
        Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
        Decimal("10")
        Bucket("my_bucket");
    # Create a proof from the bucket
    CREATE_PROOF_FROM_BUCKET_OF_ALL
        Bucket("my_bucket")
        Proof("my_proof");
    # You can now pass this proof to a method/function
    # Because we withdrew tokens from the account and they are still on the
    # worktop, we have to deposit them back into the account
    CALL_METHOD
        Address("account_sim1cx0l2jhr3lg8mzg3gp3xsun94vtw3440w77xddvvp4cjte5gdvz35k")
        "deposit_batch"
        Expression("ENTIRE_WORKTOP");

## Manifest

URL: https://radix.wiki/developers/legacy-docs/reference/transactions/manifest/manifest Updated: 2026-02-18 Summary: A manifest (also known as a “transaction manifest” or “intent manifest”) is the main part of each intent (../transaction-overview.md) in a transaction. Reference > Transactions > Manifest — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/manifest/README.md) A manifest (also known as a “transaction manifest” or “intent manifest”) is the main part of each intent (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/transaction-overview.md) in a transaction. Overview Manifests are human-readable and composable. They make it possible to compose multiple actions to be executed atomically by describing a sequence of component calls and movements of resources between components. In short, full atomic composability becomes possible directly in transactions. Manifests can also describe the use of badges for authorization to components, payment of transaction fees, and checks on resource amounts to provide guaranteed results for the user. Manifests are human-readable so that developers or client software (such as the Radix Wallet (https://github.com/gguuttss/radix-docs/blob/master/use/radix-wallet-overview.md) ) can understand what they are signing. When it’s time to submit, the transaction manifest is translated into a binary representation and cryptographically signed to create a final transaction (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/transaction-overview.md) that may be efficiently sent to the network and processed by the Radix Engine.
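To make this composability concrete, here is a minimal sketch of one atomic manifest that pays fees, withdraws XRD, calls a component, checks the result, and deposits the change. It reuses the illustrative simulator addresses from the instruction reference above; the "buy_gumball" component, the gumball resource address, and the use of ASSERT_WORKTOP_CONTAINS for the guaranteed-result check are assumptions for this example, not part of any real deployment.

```
# Pay fees and withdraw 10 XRD from the account
CALL_METHOD
    Address("account_sim1cx0l2jhr3lg8mzg3gp3xsun94vtw3440w77xddvvp4cjte5gdvz35k")
    "lock_fee"
    Decimal("10");
CALL_METHOD
    Address("account_sim1cx0l2jhr3lg8mzg3gp3xsun94vtw3440w77xddvvp4cjte5gdvz35k")
    "withdraw"
    Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
    Decimal("10");
# Put the withdrawn XRD into a named bucket and pass it to a component
TAKE_ALL_FROM_WORKTOP
    Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3")
    Bucket("payment");
CALL_METHOD
    Address("component_sim1cpvs7ulg02ah8mhcc84q7zsj4ta3pfz77uknu29xy50yelakkujqze")
    "buy_gumball"
    Bucket("payment");
# Guaranteed result: abort the whole transaction unless at least 1 gumball came back
ASSERT_WORKTOP_CONTAINS
    Address("resource_sim1n2q4le7dpzucmpnksxj5ku28r3t776pgk879cahgm76c2kfpz48fpj")
    Decimal("1");
# Deposit everything left on the worktop back into the account
CALL_METHOD
    Address("account_sim1cx0l2jhr3lg8mzg3gp3xsun94vtw3440w77xddvvp4cjte5gdvz35k")
    "deposit_batch"
    Expression("ENTIRE_WORKTOP");
```

If any instruction fails, including the assertion, the entire transaction is rejected and no resources move, which is what makes the composition atomic.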
Structure A manifest is a grouping of the following parts of an intent core (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/transaction-structure.md) (in a transaction intent (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/transaction-intents.md) or subintent (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/subintents.md) ): - Manifest Instructions (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/manifest/manifest-instructions.md) which will be executed by the Intent Processor (transaction-processor) in the Radix Engine - Blobs (efficiently encoded byte payloads which can be passed into method/function calls) - Zero or more child subintent (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/intent-structure.md) hashes - [System Transactions Only] Preallocated addresses These are combined with an intent header and optional message to form an intent core. Manifest Types See: - Conforming Manifest Types (conforming-transaction-manifest-types) for details on how the wallet displays manifests. - dApp Transactions (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-transactions/README.md) for examples and details on creating transactions, tailored towards dApp builders. ## Transaction Overview URL: https://radix.wiki/developers/legacy-docs/reference/transactions/transaction-overview Updated: 2026-02-18 Summary: This section gives an overview of the structure, design and motivation of the Radix transaction model. If you are an integrator looking to build and submit tran Reference > Transactions > Transaction Overview — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/transaction-overview.md) This section gives an overview of the structure, design and motivation of the Radix transaction model. 
If you are an integrator looking to build and submit transactions, we encourage you to read the Transactions for Integrators (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/concepts-transactions.md) . User Transactions User Transactions are also known as " Notarized (notary) Transactions". At a high level, a user transaction contains: - A transaction intent (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/transaction-intents.md) , and related signatures, including the signature of a notary. - Zero or more subintents (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/subintents.md) , each with their related signatures. Together, the transaction intents and subintents are known as the intents of a transaction. Each intent contains an intent header, a message, and a manifest (transaction-manifest) which details a human-readable set of commands that will be executed. Structure There are two structures to a transaction, discussed in separate articles: - The tree-based intent structure (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/intent-structure.md) , with the transaction intent at the root, with sub-trees of subintents below. This structure is useful when thinking about how transactions execute, and when building transactions. - The serialized transaction structure (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/transaction-structure.md) where the subintents are flattened for easy serialization and for referencing by subintent index. This structure is useful for those debugging, or working on transaction parsers. Ledger Transactions Transactions are committed by a node as a Ledger Transaction ( definition in code (https://github.com/radixdlt/radixdlt-scrypto/blob/main/radix-transactions/src/model/ledger_transaction.rs) ). 
They capture three classes of transaction: - User Transactions ( definition in code (https://github.com/radixdlt/radixdlt-scrypto/blob/main/radix-transactions/src/model/user_transaction.rs) ) - NotarizedTransactionV1 was released at Babylon (https://github.com/gguuttss/radix-docs/blob/master/updates/protocol-updates/babylon-genesis.md) , and only has support for a single transaction intent. - NotarizedTransactionV2 was released at Cuttlefish (https://github.com/gguuttss/radix-docs/blob/master/updates/protocol-updates/cuttlefish.md) and added support for subintents, timestamp-based validity and tip specification in basis points. - Protocol Update Transactions - "Flash" state updates - Executable transactions - Validator Transactions - Round update transactions, which are typically short/simple transactions, but occasionally trigger epoch updates about every 5 minutes Each ledger transaction has an associated LedgerTransactionHash, built as a hash on top of the hash of the transaction itself. Serialization and Preparation Transactions have a canonical serialization in bytes, using ManifestSbor (https://github.com/gguuttss/radix-docs/blob/master/reference/sbor-serialization/manifest-sbor/README.md) , according to the AnyTransaction (https://github.com/radixdlt/radixdlt-scrypto/blob/main/radix-transactions/src/model/any_transaction.rs) serialization. In the Rust library, you can encounter a transaction in a few forms: - Under creation, in a transaction builder, e.g. TransactionBuilder::new_v2() - Its detailed model, e.g. DetailedNotarizedTransactionV2 created from the builder - Its normal model, e.g. NotarizedTransactionV2 - Its raw bytes, e.g. RawUserTransaction - a wrapper around the canonical serialization of the transaction - Its prepared form, e.g. PreparedUserTransaction or PreparedNotarizedTransactionV2 - Its validated form, e.g. ValidatedUserTransaction or ValidatedNotarizedTransactionV2 - Its executable form, e.g. 
ExecutableTransaction In order to get the hashes of a transaction, it must be prepared. Executable Transactions When a transaction is converted into an Executable, it includes: - Configuration details, such as tip and costing details. - Details for Transaction Validation: - The transaction and subintent hashes, for replay prevention - The combined min/max epoch and min/max timestamp across all intents in the transaction - Execution details for the transaction intent and each subintent: - Implicit Proofs (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/advanced-accessrules.md) to add to the intent processor's authorization zone when it's created, e.g. from Signatures. - The intent's manifest (blobs, instructions, etc.) which are sent to the Intent Processor. ## Transactions URL: https://radix.wiki/developers/legacy-docs/reference/transactions/transactions Updated: 2026-02-18 Summary: Legacy documentation: Transactions Reference > Transactions — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/README.md) ## Entity Metadata URL: https://radix.wiki/developers/legacy-docs/reference/radix-engine/metadata/entity-metadata Updated: 2026-02-18 Summary: Metadata can be added against any global entity, either at creation time, or later using the metadata module’s metadata_set role. Reference > Radix Engine > Metadata > Entity Metadata — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/metadata/entity-metadata.md) Metadata can be added against any global entity, either at creation time, or later using the metadata module’s metadata_set role. Metadata consists of key-value pairs, where the key is a string (max length 100), and the value is either a single value or a list of values of a given entity datatype (the value has a max total length of ~4000 bytes). There is a metadata standard of common metadata entries which should be considered when configuring an entity of a given type. 
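The key and value size constraints above can be checked client-side before submission. A minimal sketch, assuming the ~4000-byte figure from the text (the helper name is hypothetical, not part of the Radix API):

```rust
// Hypothetical client-side pre-check for the metadata constraints described
// above; not part of the Radix API. Keys are strings of at most 100
// characters, and an encoded value is limited to roughly 4000 bytes.
const MAX_METADATA_KEY_STRING_LEN: usize = 100;
const MAX_METADATA_VALUE_LEN: usize = 4000; // "~4000 bytes" per the text above

fn metadata_entry_ok(key: &str, encoded_value: &[u8]) -> bool {
    key.chars().count() <= MAX_METADATA_KEY_STRING_LEN
        && encoded_value.len() <= MAX_METADATA_VALUE_LEN
}

fn main() {
    assert!(metadata_entry_ok("name", b"Component name"));
    assert!(!metadata_entry_ok(&"k".repeat(101), b"value")); // key too long
    println!("metadata pre-checks passed");
}
```

Note that the on-ledger limit is defined over the SBOR encoding of the value, so a client-side byte count is only an approximation.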
We also recommend that you configure an OwnerRole on any created components or resources so that you can update the metadata in future, if new rules or functionalities are added to the standard. Entity datatypes The supported entity datatypes are detailed below. Each entry value can be either a singleton of a given type, or a list of the same data type. The Singleton Id and List Id columns are the Enum variants of the data type which you may see in the manifest for the given data type. Type: String Singleton Id: 0u8 List Id: 128u8 Description: A String of text. Example .metadata(metadata! { init { "name" => "Component name", locked; "description" => "Some description", locked; "tags" => ["DEX", "radiswap"], locked; } }) Type: Bool Singleton Id: 1u8 List Id: 129u8 Description: A boolean value. Example .metadata(metadata! { init { "is_enabled" => true, updatable; } }) Type: U8 Singleton Id: 2u8 List Id: 130u8 Description: A byte (u8 value). Example .metadata(metadata! { init { "some_byte" => 3u8, fixed; } }) Type: U32 Singleton Id: 3u8 List Id: 131u8 Description: A 32-bit unsigned integer. Example .metadata(metadata! { init { "some_u32" => 3u32, fixed; } }) Type: U64 Singleton Id: 4u8 List Id: 132u8 Description: A 64-bit unsigned integer. Example .metadata(metadata! { init { "some_u64" => 3u64, fixed; } }) Type: I32 Singleton Id: 5u8 List Id: 133u8 Description: A 32-bit signed integer. Example .metadata(metadata! { init { "some_i32" => 3i32, fixed; } }) Type: I64 Singleton Id: 6u8 List Id: 134u8 Description: A 64-bit signed integer. Example .metadata(metadata! { init { "some_i64" => 3i64, fixed; } }) Type: Decimal Singleton Id: 7u8 List Id: 135u8 Description: A Decimal value. Example .metadata(metadata! { init { "pi" => dec!("3.145"), fixed; } }) Type: Address Singleton Id: 8u8 List Id: 136u8 Description: Any global address. Example .metadata(metadata! 
{ init { "claimed_entities" => [ GlobalAddress::from(component_1), GlobalAddress::from(component_2), GlobalAddress::from(resource_1), ], mutable; "dapp_definitions" => [ GlobalAddress::from(dapp_definition_account_1), ], fixed; "friend" => component_3, } }) Type: PublicKey Singleton Id: 9u8 List Id: 137u8 Description: A public key. Example .metadata(metadata! { init { "keys" => [ PublicKey::Ed25519(Ed25519PublicKey(key_1_bytes)), PublicKey::Secp256k1(Secp256k1PublicKey(key_2_bytes)), ], mutable; } }) Type: NonFungibleGlobalId Singleton Id: 10u8 List Id: 138u8 Description: A non fungible global id (resource address + local non fungible id). Example .metadata(metadata! { init { "badge" => NonFungibleGlobalId::new( resource_address, NonFungibleLocalId::integer(1), ), fixed; } }) Type: NonFungibleLocalId Singleton Id: 11u8 List Id: 139u8 Description: A non fungible local id. Example .metadata(metadata! { init { "ids" => [ NonFungibleLocalId::string("Hello_world").unwrap(), NonFungibleLocalId::integer(42), NonFungibleLocalId::bytes(vec![1u8]).unwrap(), NonFungibleLocalId::ruid([1; 32]).unwrap(), ], updatable; } }) Type: Instant Singleton Id: 12u8 List Id: 140u8 Description: An instant in time (represented as seconds since Unix epoch). Example .metadata(metadata! { init { "last_updated" => Instant { seconds_since_unix_epoch: 1687446137, }, updatable; } }) Type: Url Singleton Id: 13u8 List Id: 141u8 Description: A URL to a web-based page or image. Example .metadata(metadata! { init { "info_url" => Url::of("https://tokens.radixdlt.com"), fixed; "icon_url" => Url::of("https://assets.radixdlt.com/icons/icon-xrd-32x32.png"), fixed; } }) Type: Origin Singleton Id: 14u8 List Id: 142u8 Description: An origin, i.e. the scheme, host and port of a URL. Used by browsers to define a distinct security context for web security. This is used to verify a two-way trusted link between a dApp on-ledger and an origin. See the metadata standard for more information. Example .metadata(metadata! 
{ init { "claimed_websites" => [ Origin::of("https://dashboard.radixdlt.com"), ], mutable; } }) Origin metadata value Note that claimed_websites array values should not end with /. A / at the end of the URL will cause an invalid origin error. Type: PublicKeyHash Singleton Id: 15u8 List Id: 143u8 Description: A public key hash (the final 29 bytes of the Blake2b hash of the key bytes). This is effectively the content of a virtual account / identity address. Example .metadata(metadata! { init { "key_hashes" => [ PublicKeyHash::Ed25519(Ed25519PublicKeyHash(key_1_hash)), PublicKeyHash::Secp256k1(Secp256k1PublicKeyHash(key_2_hash)), ], mutable; } }) Configuring metadata roles Metadata functionality lives on the "Metadata Module" of global entities. It has a number of roles, which all default to the entity’s Owner if the roles section isn’t filled in. A more complicated Scrypto example setting these roles is given here: .metadata(metadata! { roles { metadata_locker => rule!(allow_all); metadata_locker_updater => rule!(allow_all); metadata_setter => OWNER; metadata_setter_updater => rule!(deny_all); }, init { "some_key" => "string_value", updatable; "empty_locked" => EMPTY, locked; } }) Updating and locking metadata Once created, metadata will typically be updated from the manifest, using the metadata module’s metadata_setter role. Updatable metadata can also be locked with the metadata_locker role. As noted above, both of these roles will fall back to the entity’s owner if not explicitly provided. So typically you will need to prove you’re the entity’s owner to update metadata. 
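As a sketch, such an update from the manifest might look as follows. The addresses are the simulator examples used elsewhere on this page, the owner-badge resource is assumed, and the exact SET_METADATA enum alias should be checked against the linked example manifests:

```
# Prove ownership by creating a proof of the (assumed) owner badge
# in the authzone.
CALL_METHOD
    Address("account_sim1cx0l2jhr3lg8mzg3gp3xsun94vtw3440w77xddvvp4cjte5gdvz35k")
    "create_proof_of_amount"
    Address("resource_sim1n2q4le7dpzucmpnksxj5ku28r3t776pgk879cahgm76c2kfpz48fpj")
    Decimal("1");

# Update a String metadata entry (String is Enum variant 0u8) on the entity.
SET_METADATA
    Address("account_sim1cx0l2jhr3lg8mzg3gp3xsun94vtw3440w77xddvvp4cjte5gdvz35k")
    "name"
    Enum<Metadata::String>("My renamed entity");
```

This only succeeds if the proof in the authzone satisfies the entity's metadata_setter role (here, the owner).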
Example manifests which cover metadata are here: - Commented Examples (https://github.com/radixdlt/radixdlt-scrypto/blob/main/radix-transactions/examples/metadata/metadata.rtm) - Comprehensive Example Manifests (https://github.com/radixdlt/radixdlt-scrypto/tree/main/radix-transaction-scenarios/generated-examples/bottlenose/metadata) relating to Metadata configured in manifests built here (https://github.com/radixdlt/radixdlt-scrypto/blob/main/radix-transaction-scenarios/src/scenarios/metadata.rs) ## Metadata URL: https://radix.wiki/developers/legacy-docs/reference/radix-engine/metadata/metadata Updated: 2026-02-18 Summary: Public networks like Ethereum and Radix offer a public commons where an unlimited variety of digital assets can be owned and transacted. dApps, which are made o Reference > Radix Engine > Metadata — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/metadata/README.md) Public networks like Ethereum and Radix offer a public commons where an unlimited variety of digital assets can be owned and transacted. dApps, which are made of a combination of traditional websites and servers and on-network smart contract logic (https://github.com/gguuttss/radix-docs/blob/master/build/setting-up-for-scrypto-development/setting-up-for-dapp-development.md) , bring these assets to life. To make these assets and dApps meaningful and understandable to humans, we need some form of descriptive metadata. On Ethereum, the ERC-20 token smart contract standard includes functions that return things like the token’s "name" and "symbol". A wallet can use these functions to get metadata allowing more informative display of tokens. However, this approach of having metadata fields set at a smart contract level (like the ERC-20 standard) is highly constraining. For example, it may make no sense for some kinds of tokens to have a "symbol". 
Additionally, there will certainly be new metadata fields we can apply to tokens or NFTs which are useful for specific application purposes. This is not possible for ERC-20 token smart contracts without having to deploy an entirely new smart contract with the additional metadata fields. Radix recognizes this need for metadata and makes it a universal platform feature. On Radix, creators of entities may specify access rules for who may modify metadata (or indeed if it may be changed at all). It is strongly recommended that creators retain the ability to update metadata so that they can take advantage of new standards in the future, or update metadata over the lifecycle of the application. This allows entities on the Radix Network (such as Resources, Components, and Packages) to have metadata set on them directly, with flexibility. Radix does not limit what fields may be used. Developers may define and populate metadata fields on entities as they wish. However, standardization of metadata usage is highly desirable. Standardization of metadata can enable client software (like the Radix Wallet), third-party integrations, and dashboards to display tokens and NFTs much more richly and consistently. As a result, they can offer useful features that bring about a delightful user experience and interface by relying on the consistent use of metadata. Going further, metadata is useful beyond just on-network entities. The full description of a Radix dApp typically includes not just a collection of resources and components on the Radix network, but also a website that hosts the dApp’s frontend. Below we describe our current thinking on how metadata should be standardized. The Radix Wallet and Metadata Standards The creators of the Radix Wallet propose a set of optional but strongly suggested standards of metadata usage for the Radix community to adopt as a starting point. The Radix Wallet will be a champion consumer of these standards. 
Adopting these standards with your resources and dApps means that those things will enjoy the best possible presentation in the Radix Wallet - along with other clients and services that adopt similar rules of metadata presentation. Resources and dApps that do not adhere to these standards will of course always still appear in the Radix Wallet. Whatever metadata is present will still be available for the user to see. But the Radix Wallet will look for metadata fields like "name" and "symbol" and treat those pieces of metadata specially. The way the Radix Wallet makes use of these standards may serve as a pattern for usage by other clients. There are two essential kinds of metadata in this standard: - Metadata that is set for special display in the wallet (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/metadata-standards/metadata-for-wallet-display.md) - Metadata that allows the wallet to associate and verify the various parts of a dApp (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/metadata-standards/metadata-for-verification.md) ## Transaction Limits URL: https://radix.wiki/developers/legacy-docs/reference/radix-engine/costing-and-limits/transaction-limits Updated: 2026-02-18 Summary: In Radix Engine, almost all payloads are encoded Scrypto SBOR(../../sbor-serialization/scrypto-sbor/README.md) or Manifest SBOR(../../sbor-serialization/manifes Reference > Radix Engine > Costing and Limits > Transaction Limits — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/costing-and-limits/transaction-limits.md) SBOR Limits In Radix Engine, almost all payloads are encoded Scrypto SBOR (https://github.com/gguuttss/radix-docs/blob/master/reference/sbor-serialization/scrypto-sbor/README.md)  or Manifest SBOR (https://github.com/gguuttss/radix-docs/blob/master/reference/sbor-serialization/manifest-sbor/README.md) , including but not limited to: - Transaction payload - Component state - 
Function invocation input and output - System APIs
When the engine validates or decodes these payloads, it applies the following additional limits:
- BLUEPRINT_PAYLOAD_MAX_DEPTH (48): The max depth of a blueprint payload, such as component state, events and function input/output.
- KEY_VALUE_STORE_ENTRY_MAX_DEPTH (48): The max depth of KeyValueStore entries, both keys and values.
Transaction Limits
- MAX_NUMBER_OF_INTENT_SIGNATURES (16): The maximum number of signatures that can be used to sign a transaction intent. This does not include the notary signature.
- MAX_NUMBER_OF_BLOBS (16): The maximum number of blobs a transaction can include.
- MIN_TIP_PERCENTAGE (0%): The minimum tip percentage.
- MAX_TIP_PERCENTAGE (65,535%): The maximum tip percentage.
- MAX_EPOCH_RANGE (8640): The maximum allowed epoch range in the transaction header. This is roughly 30 days assuming a 5-minute epoch time.
- MAX_TRANSACTION_SIZE (1 MiB): The maximum transaction payload size.
System Limits
- EXECUTION_COST_UNIT_LIMIT (100,000,000): The maximum number of execution cost units that a transaction can use.
- FINALISATION_COST_UNIT_LIMIT (50,000,000): The maximum number of finalisation cost units that a transaction can use.
- PREVIEW_CREDIT_IN_XRD (1,000,000): The amount of XRD that is used as the transaction fee when previewing a manifest intent.
- MAX_CALL_DEPTH (8): The max depth of the call frames. Depth increases by 1 when a function or method is invoked and decreases by 1 after the invocation is finished.
- MAX_HEAP_SUBSTATE_TOTAL_SIZE (64 MiB): The maximum total size of substates in the Heap. Both substate keys and values are accounted. The Heap is the storage that keeps all the substates of nodes that are neither globalised nor owned by a global object.
- MAX_TRACK_SUBSTATE_TOTAL_SIZE (64 MiB): The maximum total size of substates in the Track. Both substate keys and values are accounted. The Track is the storage that keeps all the substates of "loaded" globalised nodes and their children. 
- MAX_SUBSTATE_KEY_SIZE (2 KiB): The maximum size of a substate key.
- MAX_SUBSTATE_VALUE_SIZE (2 MiB): The maximum size of a substate value.
- MAX_INVOKE_PAYLOAD (1 MiB): The maximum size of an invocation payload, the encoding of the call arguments.
- MAX_EVENT_SIZE (32 KiB): The maximum size of a single event.
- MAX_NUMBER_OF_EVENTS (256): The maximum number of events that a transaction can produce.
- MAX_LOG_SIZE (32 KiB): The maximum size of a single log.
- MAX_NUMBER_OF_LOGS (256): The maximum number of logs that a transaction can produce.
- MAX_PANIC_MESSAGE_SIZE (32 KiB): The maximum size of a panic message.
WASM Limits
- MAX_MEMORY_SIZE_IN_PAGES (8): The maximum WASM linear memory size, in pages.
- MAX_INITIAL_TABLE_SIZE (1024): The maximum initial table size.
- MAX_NUMBER_OF_BR_TABLE_TARGETS (256): The maximum number of BR table targets.
- MAX_NUMBER_OF_GLOBALS (512): The maximum number of global variables.
- MAX_NUMBER_OF_FUNCTIONS (8192): The maximum number of functions.
- MAX_NUMBER_OF_FUNCTION_PARAMS (32): The maximum number of parameters in a single function.
- MAX_NUMBER_OF_FUNCTION_LOCALS (256): The maximum number of locals in a single function.
- MAX_NUMBER_OF_BUFFERS (32): The max number of buffers that a Scrypto VM can allocate. Buffers are used for data exchange between the system and Scrypto blueprints.
Metadata Limits
- MAX_METADATA_KEY_STRING_LEN (100): The maximum length of a metadata key (in characters).
- MAX_METADATA_VALUE_SBOR_LEN (4 KiB): The maximum size of a metadata value (measured in bytes of the SBOR encoding).
- MAX_URL_LENGTH (1024): The maximum length of a URL.
- MAX_ORIGIN_LENGTH (1024): The maximum length of an Origin.
Access Rule Limits
- MAX_ACCESS_RULE_DEPTH (8): The maximum depth of an access rule.
- MAX_ACCESS_RULE_TOTAL_NODES (64): The maximum total number of nodes in an access rule. 
Royalty Limits
- MAX_PER_FUNCTION_ROYALTY_AMOUNT_IN_XRD (166.666666666666666666): The maximum royalty amount (converted into XRD) that can be set on a function. This is roughly 10 USD.
Blueprint Package Limits
- MAX_FEATURE_NAME_LEN (100): The maximum length of a feature name. Note: "features" is a capability exposed to native blueprints only.
- MAX_EVENT_NAME_LEN (100): The maximum length of an event name.
- MAX_REGISTERED_TYPE_NAME_LEN (100): The maximum length of the name of a registered type.
- MAX_BLUEPRINT_NAME_LEN (100): The maximum length of a blueprint name.
- MAX_FUNCTION_NAME_LEN (256): The maximum length of a function name.
- MAX_NUMBER_OF_BLUEPRINT_FIELDS (256): The maximum number of fields defined in a blueprint. Note: Scrypto blueprints have only one field as of now.
## Transaction Costing URL: https://radix.wiki/developers/legacy-docs/reference/radix-engine/costing-and-limits/transaction-costing Updated: 2026-02-18 Summary: At the very high-level, transaction costs can be broken down into the following categories: Reference > Radix Engine > Costing and Limits > Transaction Costing — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/costing-and-limits/transaction-costing.md) What are transaction costs? At a very high level, transaction costs can be broken down into the following categories:
- Execution (Execution Cost Unit): This accounts for the CPU usage during the execution of a transaction. In this context, execution refers to the process of determining the state changes and other side effects by interpreting the instructions one-by-one upon some existing world state.
- Finalisation (Finalisation Cost Unit): This accounts for the CPU usage during the finalisation of a transaction. In this context, finalisation refers to the process of committing state changes derived from a transaction and other auxiliary indices. 
- Storage (Bytes): This accounts for the additional data being added to a node database. There are currently two types of storage costs: State Storage (the substates) and Archive Storage (transaction payload, events and logs).
- Royalties (XRD and USD): The amount of XRD paid to blueprint developers and component owners for the use of the code or component instances.
- Tip (Percentage): The extra amount of XRD paid to validators for the processing of a transaction. This is designed to help prioritise transactions when there is a traffic jam.
How does costing work? Transaction costing is done through the costing module within the System. This module is responsible for tracking the fee balance, counting execution and finalisation cost units and applying the costs listed above, with the help of a fee reserve (https://github.com/radixdlt/radixdlt-scrypto/blob/6d35fe85de69d82b85700aaa9a68310a6163b72e/radix-engine/src/system/system_modules/costing/fee_reserve.rs#L98-L143) . At the beginning of a transaction, the fee reserve is provided with a loan of XRD to bootstrap transaction execution.
- The amount is defined as: execution_cost_unit_price * (1 + tip_percentage / 100) * execution_cost_unit_loan
- This loan must be repaid before execution_cost_unit_loan execution cost units are consumed (the "repay condition"), otherwise the transaction is rejected.
After that, transaction execution starts, during which two processes happen:
- The kernel and system send execution cost events (such as CPU usage, royalties) to the costing module:
  - To deduct the fee balance
  - To increase the cost unit counter
  - To repay the system loan and apply deferred costs, if the repay condition is met
- The system sends vault lock fee events to the costing module:
  - To credit the fee balance
Once execution is done, the costing module is instructed to apply the finalisation cost and storage cost.
Costing Parameters
The following parameters are used by the costing module. 
Protocol defined parameters
- execution_cost_unit_price (0.00000005): The price of an execution cost unit in XRD.
- execution_cost_unit_limit (100,000,000): The maximum number of execution cost units that a transaction can consume.
- execution_cost_unit_loan (4,000,000): The number of execution cost units loaned from the system, to bootstrap transaction execution.
- finalization_cost_unit_price (0.00000005): The price of a finalisation cost unit in XRD.
- finalization_cost_unit_limit (50,000,000): The maximum number of finalisation cost units that a transaction can consume.
- usd_price (16.666666666666666666): The price of USD in XRD. 1 XRD = 0.06 USD.
- state_storage_price (0.00009536743): The price of state storage. 1 MiB = 6 USD.
- archive_storage_price (0.00009536743): The price of archive storage. 1 MiB = 6 USD.
Transaction defined parameters
- tip_percentage: The tip percentage specified by the transaction.
- free_credit_in_xrd: The free credit amount specified by a preview request.
Fee Table
The table below further defines the cost of each costing entry.
Execution (execution cost unit):
- VerifyTxSignatures: Verify transaction signatures. Variable: num_of_signature * 7000
- ValidateTxPayload: Validate transaction payload. Variable: size(payload) * 40
- RunNativeCode: Run native code. Native code execution is billed in native execution units per function based on the table here (https://github.com/radixdlt/radixdlt-scrypto/blob/main/radix-engine/assets/native_function_base_costs.csv) . 34 native execution units = 1 execution cost unit.
- RunWasmCode: Run WASM code. WASM code execution is billed in WASM execution units per instruction based on the weights here (https://github.com/radixdlt/radixdlt-scrypto/blob/main/radix-engine/src/vm/wasm/weights.rs) . 3000 WASM execution units = 1 execution cost unit.
- PrepareWasmCode: Prepare WASM code. Variable: size(wasm) * 2
- BeforeInvoke: Before function invocation. Variable: size(input) * 2
- AfterInvoke: After function invocation. 
  Variable: size(output) * 2
- AllocateNodeId: Allocate a new node id. A node is the lower-level representation of an object or key value store. Fixed: 97
- CreateNode: Create a new node. Variable: size(substate) + 456
- DropNode: Drop a node. Variable: size(substate) + 1143
- PinNode: Pin a node to a device (heap or track). Variable: 12 + IO access
- MoveModule: Move a module from one node to another. Variable: 140 + IO access
- OpenSubstate: Open a substate of a node. Variable: 303 + IO access
- ReadSubstate: Read the value of a substate. Variable: if from heap: 65 + size(substate) * 2 + IO access; if from track: 113 + size(substate) * 2 + IO access
- WriteSubstate: Update the value of a substate. Variable: 218 + size(substate) * 2 + IO access
- CloseSubstate: Close a substate. Fixed: 129
- MarkSubstateAsTransient: Mark a substate as transient. Transient substates are not committed after the transaction. Fixed: 55
- SetSubstate: Set the value of a substate. Variable: 133 + size(substate) * 2 + IO access
- RemoveSubstate: Remove a substate. Variable: 717 + IO access
- ScanKeys: Scan substate keys in a collection. Variable: 498 + IO access
- ScanSortedSubstates: Scan substates in a collection. Variable: 187 + IO access
- DrainSubstates: Drain substates in a collection. Variable: 273 * num_of_substates + 272 + IO access
- LockFee: Lock fee. Fixed: 500
- QueryFeeReserve: Query the state of the fee reserve. Fixed: 500
- QueryActor: Query the actor of this call frame. Fixed: 500
- QueryTransactionHash: Query the transaction hash. Fixed: 500
- GenerateRuid: Generate a RUID. Fixed: 500
- EmitEvent: Emit an event. Variable: 500 + size(event) * 2
- EmitLog: Emit a log. Variable: 500 + size(log) * 2
- Panic: Panic and abort execution. Variable: 500 + size(message) * 2
Finalisation (finalisation cost unit):
- CommitStateUpdates: Commit state updates. Variable, per substate: insert or update: 100,000 + size(substate) / 4; delete: 100,000
- CommitEvents: Commit events. Variable, per event: 5,000 + size(event) / 4
- CommitLogs: Commit logs. Variable, per log: 1,000 + size(log) / 4
Storage (XRD): 
- IncreaseStateStorageSize: Increase the size of the state storage. Variable, per byte: 0.00009536743
- IncreaseArchiveStorageSize: Increase the size of the archive storage. Variable, per byte: 0.00009536743
IO access cost:
- Read from database and found: 40,000 + size / 10
- Read from database and not found: 160,000
Costing Runtime APIs
The system exposes a set of APIs to query the current state of the fee reserve. See the WASM API (https://github.com/radixdlt/radixdlt-scrypto/blob/6d35fe85de69d82b85700aaa9a68310a6163b72e/scrypto/src/engine/wasm_api.rs#L240-L258) and the Scrypto rustdoc (https://docs.rs/scrypto/latest/scrypto/runtime/struct.Runtime.html#method.get_execution_cost_unit_limit) .
Fee Distribution
| Cost | To Proposer | To Validator Set | To Burn | To Royalty Owners |
| --- | --- | --- | --- | --- |
| Execution | 25% | 25% | 50% | 0 |
| Finalisation | 25% | 25% | 50% | 0 |
| Storage | 25% | 25% | 50% | 0 |
| Royalty | 0 | 0 | 0 | 100% |
| Tip | 100% | 0 | 0 | 0 |
## Costing and Limits URL: https://radix.wiki/developers/legacy-docs/reference/radix-engine/costing-and-limits/costing-and-limits Updated: 2026-02-18 Summary: Pages in this section describe how the Radix Engine applies costing and limits during the execution of a transaction. Reference > Radix Engine > Costing and Limits — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/costing-and-limits/README.md) Pages in this section describe how the Radix Engine applies costing and limits during the execution of a transaction. Costing is the process of billing for the computation and storage that a transaction consumes. This is to encourage fair use of the network capacity while disincentivizing transaction spamming. As of now, XRD is the only currency used for costing. Limits are designed to prevent an individual transaction from placing too high a burden on any specific part of the stack, ensuring that every node meeting a minimum configuration specification is able to successfully process the transaction in a timely fashion. 
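The system-loan formula from the costing section can be checked numerically against the protocol-defined parameters. A minimal sketch; the parameter values are the ones listed on this page, while the tip values chosen are purely illustrative:

```rust
// Sketch of the system-loan arithmetic described in the costing section.
// execution_cost_unit_price and execution_cost_unit_loan are the
// protocol-defined parameters from this page.
const EXECUTION_COST_UNIT_PRICE: f64 = 0.00000005; // XRD per execution cost unit
const EXECUTION_COST_UNIT_LOAN: f64 = 4_000_000.0; // cost units loaned at start

// Loan granted to the fee reserve at the start of a transaction:
// execution_cost_unit_price * (1 + tip_percentage / 100) * execution_cost_unit_loan
fn system_loan_xrd(tip_percentage: f64) -> f64 {
    EXECUTION_COST_UNIT_PRICE * (1.0 + tip_percentage / 100.0) * EXECUTION_COST_UNIT_LOAN
}

fn main() {
    // With no tip: 0.00000005 * 4,000,000 = 0.2 XRD.
    println!("loan at 0% tip:  {} XRD", system_loan_xrd(0.0));
    // A 10% tip scales the loan to 0.22 XRD.
    println!("loan at 10% tip: {} XRD", system_loan_xrd(10.0));
}
```

This also shows why the repay condition matters: the loan covers only the first execution_cost_unit_loan cost units, so a vault lock_fee must credit the fee balance before those are consumed.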
## Account Locker URL: https://radix.wiki/developers/legacy-docs/reference/radix-engine/native-blueprints/locker Updated: 2026-02-18 Summary: The locker package, and the account locker blueprint in particular, offer a new pattern for applications to handle account deposits in a way that doesn't fail due to account deposit settings. Reference > Radix Engine > Native Blueprints > Account Locker — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/locker.md) Introduction The locker package, and the account locker blueprint in particular, offer a new pattern for applications to handle account deposits in a way that doesn't fail due to account deposit settings, does not require authorized depositor or user badges, and allows applications to not keep track of claims or resources that they owe to users. This pattern is needed because accounts on the network can reject deposits that do not align with their account deposit rules. One solution to enable direct account deposits is the authorized depositor badge concept (account.md#authorized-depositors) . However, authorized depositor badges are not ideal for all use cases, as they require prior coordination between the recipient and the application wishing to send tokens to them. Asset bridges are a good example of a case where it's important that the depositor has a guaranteed way of making sure that the intended recipient can get their tokens, regardless of whether they've pre-authorized the bridge. Refunds on cross-network bridging are not always possible, and most bridges are built around a "fire-and-forget" pattern where the asset is sent on the receiving network and there is no follow-up to handle failure cases. At the same time, it's equally important that Radix users who have configured their accounts to restrict deposits are not bothered by unexpected tokens showing up.
While the above example concerns bridges, the same set of problems and the solutions presented in this document generalize to any application that is interested in sending resources to accounts that may not currently be configured to receive them, without doing any special bookkeeping for the failed deposits. This document covers the locker package, which contains blueprints that allow resources to be stored within them and claimed if a claim check succeeds. The package currently contains a single blueprint: the AccountLocker blueprint. Other blueprints could be added in the future that perform checks other than reading and asserting against the owner's role. AccountLocker components allow an administrator of a component to deposit resources and specify who can claim them; users can then claim these resources. At claim time, the blueprint checks whether the caller can demonstrate owner powers over the account by reading the owner role of the account and asserting against it. If the assertion passes, the caller has owner powers over the account and the resources are returned to them. Otherwise, the claim fails. Account lockers can also first attempt to deposit the resources into the claimant's account and only store them in the locker if the deposit fails due to account deposit rules. Resources stored in the account locker can be recovered by an administrator if they have not been claimed, and that power can be given up at instantiation time. Use Cases The account locker blueprint is generally useful for any application that needs to deposit resources directly into accounts where some of those deposits can fail, without doing any additional bookkeeping on the failed deposits, such as keeping them to be claimed later and issuing a user claim badge, or asking users to add the application's authorized depositor badge to their account.
It allows applications to first attempt the deposit into accounts and then store the resources in the application's locker if the deposit fails, where they can be claimed later on. This section lists some concrete use cases for this blueprint.

- Airdrops: In a similar way to bridges and exchanges, airdrop transactions can fail if the account deposit rules do not allow for the airdrop. Historically, if the airdrop was of substantial value, the airdropper would need to keep track of failed airdrops off-ledger to attempt them later, adding complexity to what is conceptually a simple operation. The account locker blueprint, with its airdrop method, simplifies airdrops drastically: the blueprint first attempts to deposit the resources into the accounts and, if a deposit fails, stores them in the account locker for the account owners to claim.
- Exchanges: Some accounts might have their deposit rules configured to not allow resources that they don't currently hold any of. This can lead some exchange withdrawal transactions to fail, since the deposits are rejected by the user's account. To overcome this, an exchange might set up an account locker such that any deposits rejected by an account are deposited into the account locker for the account owner to claim at a later point. This eliminates the chance of exchange transactions failing due to account deposit rules and is much easier to set up than authorized depositor badges.
- Bridges: The account locker blueprint can be useful to bridges when assets are being sent to the Radix network. A bridge would first attempt to deposit the resources directly into the account and, if that deposit fails, store them in its locker so they can be claimed by the account owner later on. This approach removes the friction that comes with authorized depositor badges and user claim badges.
Features The features of the AccountLocker blueprint are as follows:

- An account-first design with an interface optimized for use with the native account blueprint.
- The ability for an administrator (called the storer) to deposit resources into their locker component.
- The ability for an administrator (called the recoverer) to forcefully withdraw resources from their locker component.
- The ability for an administrator (called the recoverer) to give up their ability to forcefully withdraw resources from the locker component.
- The ability for a user to claim resources destined for their account.
- The ability for the locker blueprint to determine whether a claim is allowed to go forward by reading and asserting against the owner role of the account.

The functionality provided by the account locker blueprint is perhaps best explained by the state machine below. It represents the movement of resources between an account, the worktop, and an account locker component. The resources start in an account and can be withdrawn to the worktop. If the store method is called by the storer role, the resources move from the worktop into the locker component to be stored and returned to the claimant when they request to claim. If the try_direct_send flag was set to true, the locker first attempts to deposit them into the destination (claimant) account. The deposit is done using try_deposit_or_refund, meaning that if the deposit fails the resources are refunded back to the locker; in such cases, the locker stores the resources to be claimed later on. Resources leave the locker and go to the worktop by calling the claim or recover methods. The claim method is callable by users to claim the resources that are locked for them in the account locker, while the recover method is callable by the recoverer role to recover resources that have been stored in the account locker but have not been claimed.
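The store → (optional direct deposit) → claim flow described above can be modelled with a tiny plain-Rust sketch. Everything here is illustrative, not the blueprint's real interface: the bools stand in for the account's deposit rules and for the owner-role assertion, and amounts stand in for buckets and vaults.

```rust
use std::collections::HashMap;

/// Toy model of an account locker holding fungible claims, keyed by claimant.
#[derive(Default)]
struct Locker {
    claims: HashMap<&'static str, u64>,
}

impl Locker {
    /// `store` with `try_direct_send`: first try the direct deposit (as with
    /// `try_deposit_or_refund`); if the account refuses it, keep the claim in
    /// the locker. Returns `true` if the deposit went through directly.
    fn store(
        &mut self,
        claimant: &'static str,
        amount: u64,
        try_direct_send: bool,
        account_accepts_deposit: bool,
    ) -> bool {
        if try_direct_send && account_accepts_deposit {
            return true; // deposited directly; nothing stored, no StoreEvent
        }
        *self.claims.entry(claimant).or_insert(0) += amount;
        false // stored in the locker, to be claimed later
    }

    /// `claim`: succeeds only if the caller passes the owner-role check for
    /// the claimant account (modelled here by `caller_owns_claimant`).
    fn claim(
        &mut self,
        claimant: &'static str,
        amount: u64,
        caller_owns_claimant: bool,
    ) -> Result<u64, &'static str> {
        if !caller_owns_claimant {
            return Err("owner-role assertion failed");
        }
        let balance = self.claims.get_mut(claimant).ok_or("nothing to claim")?;
        if *balance < amount {
            return Err("insufficient claim balance");
        }
        *balance -= amount;
        Ok(amount)
    }
}

fn main() {
    let mut locker = Locker::default();
    // The account's deposit rules refuse the direct deposit, so the claim is
    // stored in the locker:
    assert!(!locker.store("alice", 10, true, false));
    // A caller who cannot pass the owner-role check cannot claim it:
    assert!(locker.claim("alice", 10, false).is_err());
    // The account owner can:
    assert_eq!(locker.claim("alice", 10, true), Ok(10));
}
```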
The ability for resources to be recovered can be given up by setting the recoverer and recoverer_updater roles to AccessRule::DenyAll. Once the resources are back on the worktop they can be deposited into the account or perhaps even back into the locker.

Roles This section defines the different roles that exist on the account locker blueprint, their capabilities, and how they can be updated.

| Role Name | Capabilities | Update |
| --- | --- | --- |
| Owner | The owner role is not given any explicit capabilities aside from the implicit ability to update metadata on the component. | Determined by the instantiator of the component, through passing either OwnerRole::Updatable or OwnerRole::Fixed when instantiating the component. |
| storer | An administrator role with the authority to store resources in the locker for a specific account to claim. | Updatable by the storer_updater role. |
| storer_updater | Controls who can update the storer role access rule. | It can update itself. |
| recoverer | An administrator role with the authority to forcefully withdraw (recover) any assets previously deposited into the locker. | Updatable by the recoverer_updater role. |
| recoverer_updater | Controls who can update the recoverer role access rule. | It can update itself. |

State The state of the account locker blueprint has no top-level fields, just a single collection where the key is the claimant account address and the value is a KeyValueStore.

Interface The AccountLocker has an account-first interface where all of the methods and functions take in a Global<Account> rather than just a ComponentAddress.

Functions

Name: instantiate
Type: Function
Callable By: Anyone
Description: This function instantiates a new account locker, returning a Global<AccountLocker> to the caller configured based on the passed arguments.
Events: None
Arguments:
- owner_role: OwnerRole - The definition of the role that owns the instantiated component. As described in the "Roles" section, the owner role is not given any powers aside from the implicit metadata roles.
- storer_role: AccessRule - The access rule to assign to the role that can deposit resources into the account locker for user accounts to claim at a later point.
- storer_updater_role: AccessRule - The access rule to assign to the role that can update the AccessRule controlling who can deposit resources into the account locker component.
- recoverer_role: AccessRule - The access rule to assign to the role that can forcefully withdraw resources out of the account locker component.
- recoverer_updater_role: AccessRule - The access rule to assign to the role that can update the AccessRule controlling who can forcefully withdraw resources out of the account locker component.
- address_reservation: Option<GlobalAddressReservation> - An optional address reservation to use when globalizing the account locker component.
Returns: Global<AccountLocker> - A reference to the global account locker component instantiated in this function.

Name: instantiate_simple
Type: Function
Callable By: Anyone
Description: This function instantiates a new account locker, returning a Global<AccountLocker> to the caller configured based on the passed arguments. This is a second constructor for the blueprint that is meant to be much simpler than the instantiate function. It creates a new admin badge resource and uses it as the owner, storer, storer_updater, and, depending on whether the instantiator wishes to allow forceful withdrawals from the component, the recoverer and recoverer_updater. If the allow_recover argument is set to true then the admin badge created in this function is set as the recoverer and recoverer_updater role. Otherwise, if it's set to false, those roles are set to rule!(deny_all), which effectively means that no one is allowed to forcefully withdraw resources from the account locker: once a claim is deposited into it, it either gets claimed or stays there forever. Under the hood, aside from the creation of the admin badge resource, this function calls instantiate.
Events: None
Arguments:
- allow_recover: bool - A boolean that controls whether forceful withdrawals of resources in the account locker should be allowed for the admin badge created by this function. If true then the admin badge can withdraw any resources in the component; otherwise, no one can perform forceful withdrawals.
Returns:
- Global<AccountLocker> - A reference to the global account locker component instantiated in this function.
- Bucket - A bucket containing the admin badge created in this function.

Methods

storer Role Methods This section contains the methods that are callable by the storer role: store and airdrop. These two methods perform the same functionality: storing resources in the locker for users to claim and, if a try_direct_send flag is set to true, first attempting to deposit the resources into the claimant's account before storing them. The main difference between the two is that airdrop can be thought of as a batch version of store that provides an interface that makes airdrops simpler. Where store takes in a single claimant account, airdrop takes in multiple claimant accounts, distributes the passed bucket according to the specified amounts/ids, and then calls store for each claimant and bucket.

Name: store
Type: Method
Callable By: storer role
Description: A privileged method that can be called only by the storer role to store some resources in the locker to be claimed by a particular account, or to attempt to deposit them into the claimant's account. If the account does not have any prior resources locked in the locker then a new KeyValueStore will be created for the account. Similarly, if a particular account and resource address pair does not have a corresponding vault then a new entry will be added to the aforementioned key-value store with a newly created vault. The behavior of this method changes depending on the try_direct_send flag. When true this method will first attempt to deposit the resources into the account.
If the deposit fails then the resources will be stored in the locker. Otherwise, if the flag is false, no deposit is attempted and the resources are stored in the locker for the account to claim. This method emits a StoreEvent when resources are stored in the locker. When a deposit is attempted and succeeds, no StoreEvent is emitted, as nothing was stored in the locker.
Events: A StoreEvent is emitted for each account claim stored in the locker. Any claims that were successfully deposited into their destination accounts will not have this event emitted for them.
Arguments:
- claimant: Global<Account> - A global account address of the account that can claim the stored resources.
- bucket: Bucket - A bucket of resources to store in the account locker for the claimant account, allowing them to claim it at a later point.
- try_direct_send: bool - Controls whether this method will first attempt to deposit the resources into the claimant account. If true then a deposit into the account will be attempted.
Returns: None

Name: airdrop
Type: Method
Callable By: storer role
Description: A privileged method that can be called only by the storer role to perform an airdrop into some accounts. This method takes in a single bucket of resources and a map of claimants and the amount that each is to be given. The bucket is distributed among the claimants according to the amounts/ids specified for each claimant. The behavior then differs based on the try_direct_send flag: for each claimant and bucket, if the flag is set to true then the locker component will first attempt to deposit the resources into the account and, if the deposit fails, store them in the locker. Since this method distributes the bucket among the claimants, there could potentially be some change at the end.
In this case, a Bucket with that change is returned from this method. If the try_direct_send flag is set to true, some deposits might succeed and some might fail, in which case the ones that failed will be stored in the locker (not returned back to the caller). A StoreEvent is emitted for each claim that is stored in the locker, meaning that claims that were successfully deposited into accounts won't have events emitted for them.
Events: A StoreEvent is emitted for each account claim stored in the locker. Any claims that were successfully deposited into their destination accounts will not have this event emitted for them.
Arguments:
- claimants: IndexMap<Global<Account>, ResourceSpecifier> - An IndexMap of all of the claimants and the amount or ids of resources that go to them.
- bucket: Bucket - A bucket of resources. Ideally, the sum of all of the ResourceSpecifiers in claimants should total up to what is in this bucket; if it does not, the difference is returned as change. This bucket will be split up across the claimants based on the specified ResourceSpecifiers.
- try_direct_send: bool - Controls whether this method will first attempt to deposit the resources into the claimant accounts. If true then a deposit into each account will be attempted.
Returns: Option<Bucket> - A bucket of the unused resources (the change).

recoverer Role Methods This section has an API reference for the two methods callable by the recoverer role: recover and recover_non_fungibles. Both methods provide the same functionality: the ability for the recoverer role to forcefully withdraw resources from the account locker that they might have previously committed to it. This can be useful in many cases, including airdropped resources that were not claimed within a specified period.
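The distribute-then-return-change behaviour of the airdrop method described above can be sketched in plain Rust. The types and names here are illustrative (amounts stand in for a bucket of one fungible resource), not the blueprint's real interface:

```rust
use std::collections::BTreeMap;

/// Split a single bucket (an amount of one fungible resource) across the
/// claimants per their specified amounts, returning the per-claimant shares
/// and any unused change, mirroring `airdrop`'s `Option<Bucket>` change
/// return. Errors if the bucket cannot cover every specified amount.
fn airdrop_split(
    bucket: u64,
    claimants: &BTreeMap<&'static str, u64>,
) -> Result<(BTreeMap<&'static str, u64>, u64), &'static str> {
    let mut remaining = bucket;
    let mut shares = BTreeMap::new();
    for (claimant, amount) in claimants {
        // The bucket must cover every specified amount.
        remaining = remaining.checked_sub(*amount).ok_or("bucket too small")?;
        shares.insert(*claimant, *amount);
    }
    Ok((shares, remaining)) // `remaining` is the change handed back
}

fn main() {
    let claimants = BTreeMap::from([("alice", 3u64), ("bob", 4u64)]);
    let (shares, change) = airdrop_split(10, &claimants).unwrap();
    assert_eq!(shares[&"alice"], 3);
    assert_eq!(change, 3);
}
```

The real method then calls store per claimant, so each share is either deposited directly or kept in the locker depending on try_direct_send.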
The main difference between these two methods is the same as the difference between the withdraw and withdraw_non_fungibles methods on account: whether the recovery happens based on an amount or on ids of resources.

Name: recover
Type: Method
Callable By: recoverer role
Description: A privileged method that can be called only by the recoverer role to forcefully withdraw (recover) resources from the account locker. This might be useful in cases where the resources have been there for a while and have not been claimed. If the recoverer and recoverer_updater roles are set to rule!(deny_all) then no resources can be recovered from the account locker. This method follows the behavior of the withdraw method on the account blueprint in terms of fungible and non-fungible treatment; more specifically, it allows recovery of both fungible and non-fungible resources by amount.
Events: A RecoverEvent is emitted when a claim is recovered.
Arguments:
- claimant: Global<Account> - A global account address of the account whose claims to forcefully withdraw from.
- resource_address: ResourceAddress - The address of the resource to forcefully withdraw.
- amount: Decimal - The amount of resources to forcefully withdraw.
Returns: Bucket - A bucket of the resources forcefully withdrawn.

Name: recover_non_fungibles
Type: Method
Callable By: recoverer role
Description: A privileged method that can be called only by the recoverer role to forcefully withdraw (recover) non-fungible resources, by their ids, from the account locker. This might be useful in cases where the resources have been there for a while and have not been claimed. If the recoverer and recoverer_updater roles are set to rule!(deny_all) then no resources can be recovered from the account locker.
Events: A RecoverEvent is emitted when a claim is recovered.
Arguments:
- claimant: Global<Account> - A global account address of the account whose claims to forcefully withdraw from.
- resource_address: ResourceAddress - The address of the resource to forcefully withdraw.
- ids: IndexSet<NonFungibleLocalId> - The set of non-fungible local ids to recover.
Returns: Bucket - A bucket of the resources forcefully withdrawn.

User Methods This section has an API reference for the methods that are publicly callable on locker components: the claim and claim_non_fungibles methods. Much like the recoverer role methods, the main difference between these two methods is the same as the difference between the withdraw and withdraw_non_fungibles methods on account: whether the claim happens based on an amount or on ids of resources.

Name: claim
Type: Method
Callable By: Anyone
Description: A public method called by the claimant to claim their resources from the account locker. To determine whether the claim is allowed to go through, this method reads the claimant's owner role and asserts against it. If the assertion is successful then the claim is allowed to go through; otherwise, the transaction fails. If the owner check succeeds then the amount specified as an argument will be claimed from the vault associated with the passed claimant and resource address. This method follows the behavior of the withdraw method on the account blueprint in terms of fungible and non-fungible treatment; more specifically, it allows claiming of both fungible and non-fungible resources by amount.
Events: A ClaimEvent is emitted when resources are claimed.
Arguments:
- claimant: Global<Account> - A global account address of the claimant.
- resource_address: ResourceAddress - The address of the resource to claim.
- amount: Decimal - The amount of resources to claim.
Returns: Bucket - A bucket of the resources claimed.

Name: claim_non_fungibles
Type: Method
Callable By: Anyone
Description: A public method called by the claimant to claim their resources from the account locker.
To determine whether the claim is allowed to go through, this method reads the claimant's owner role and asserts against it. If the assertion is successful then the claim is allowed to go through; otherwise, the transaction fails. If the owner check succeeds then the non-fungibles with the specified ids will be claimed from the vault associated with the passed claimant and resource address.
Events: A ClaimEvent is emitted when resources are claimed.
Arguments:
- claimant: Global<Account> - A global account address of the claimant.
- resource_address: ResourceAddress - The address of the resource to claim.
- ids: IndexSet<NonFungibleLocalId> - The set of non-fungible local ids to claim.
Returns: Bucket - A bucket of the resources claimed.

Getter Methods

Name: get_amount
Type: Method
Callable By: Anyone
Description: A public method that can be called by anyone to get the amount of resources currently available in a claimant's vault.
Events: None
Arguments:
- claimant: Global<Account> - A global account address of the claimant.
- resource_address: ResourceAddress - The address of the resource to get the amount of.
Returns: Decimal - The amount of the resources in the vault.

Name: get_non_fungible_local_ids
Type: Method
Callable By: Anyone
Description: A public method that can be called by anyone to get the non-fungible local IDs of resources currently available in a claimant's vault.
Events: None
Arguments:
- claimant: Global<Account> - A global account address of the claimant.
- resource_address: ResourceAddress - The address of the resource to get the ids of.
- limit: u32 - An upper limit on the number of ids to return.
Returns: IndexSet<NonFungibleLocalId> - The set of the first limit non-fungible local ids in the vault.
Events The structure of the events that can be emitted by the account locker blueprint is as follows:

```rust
#[derive(ScryptoSbor, ScryptoEvent, Debug, Clone, PartialEq, Eq)]
pub struct StoreEvent {
    pub claimant: Global<Account>,
    pub resource_address: ResourceAddress,
    pub resources: ResourceSpecifier,
}

#[derive(ScryptoSbor, ScryptoEvent, Debug, Clone, PartialEq, Eq)]
pub struct RecoverEvent {
    pub claimant: Global<Account>,
    pub resource_address: ResourceAddress,
    pub resources: ResourceSpecifier,
}

#[derive(ScryptoSbor, ScryptoEvent, Debug, Clone, PartialEq, Eq)]
pub struct ClaimEvent {
    pub claimant: Global<Account>,
    pub resource_address: ResourceAddress,
    pub resources: ResourceSpecifier,
}

#[derive(Clone, Debug, ScryptoSbor, ManifestSbor, PartialEq, Eq)]
pub enum ResourceSpecifier {
    Fungible(Decimal),
    NonFungible(IndexSet<NonFungibleLocalId>),
}
```

As seen in the Interface section, three main event types are emitted by the account locker.

- StoreEvent - An event emitted when resources are stored in the account locker; it is potentially emitted when calling the store and airdrop methods. It is only emitted for resources that were actually stored in the account locker; resources that were successfully deposited into accounts do not emit this event.
- RecoverEvent - An event emitted when resources are forcefully withdrawn from the account locker; it is emitted when calling the recover and recover_non_fungibles methods.
- ClaimEvent - An event emitted when resources are claimed from the account locker; it is emitted when calling the claim and claim_non_fungibles methods.

The events are reconcilable, and the state of the account locker can be determined through events alone:

- Fungible Resources: Given a particular claimant account address and resource address, the amount that the account locker holds for it can be determined by summing the StoreEvents for the claimant and resource and subtracting the RecoverEvents and ClaimEvents for the claimant and resource.
- Non-Fungible Resources: Given a particular claimant account address and resource address, the non-fungible IDs that the account locker holds for it can be determined by sequential union and difference operations on the set of non-fungibles the locker has, in the order that the events were emitted. Before observing any events we start with an empty set of non-fungibles. StoreEvents are union operations with the stored non-fungibles; RecoverEvents and ClaimEvents are difference operations with the stored non-fungibles.

Clients interested in reconciling the state of an account locker through events alone can find a reference implementation of state reconciliation in the account locker tests here (https://github.com/radixdlt/radixdlt-scrypto/blob/2fe31d6ea6ef0434e706eb5196535fd31beeb9a2/radix-engine-tests/tests/blueprints/account_locker.rs#L2910-L2942) . Although this example is in Rust, the logic is the same as described above regardless of the language.

Metadata When a locker component is instantiated it comes with no metadata whatsoever; it's the job of the owner role to add metadata to it as it sees fit. Standards-wise, a locker component would just need to have the typical dapp_definition metadata field that components have, as described by the "Metadata for Wallet Verification" (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/metadata-standards/metadata-for-verification.md) document. The metadata standard for dApp definitions changes very slightly to accommodate locker components:

- DApp definition accounts get a new metadata field called account_locker of the type GlobalAddress, which is the address of the account locker that the dApp is using.
- The address of the account locker must be added to the claimed_entities vector.

If the dApp does not wish to have an account locker then there is no need for the account_locker metadata field.
If a dApp only adds their account locker to the claimed_entities field of the dApp definition but does not add it to the account_locker field then the wallet will not be able to discover the account locker and it will not prompt the user to claim the resources they have in that account locker. If an account locker is in the account_locker field and not in the claimed_entities field then this creates an invalid two-way link where the account locker potentially claims the dApp but the dApp does not claim the account locker. Examples Airdrop from Manifest Doing an airdrop using the account locker blueprint is simple and involves two main steps: the first step is instantiating a new account locker component from the account locker blueprint ( get the locker_package address (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/well-known-addresses.md) ) and the second is performing the airdrop by calling the airdrop method on the account locker. Notice that none of these steps require a custom Scrypto blueprint, the entire airdrop process and the potential user claims can be done through a series of transaction manifests. To create a new account locker we will use the instantiate_simple method which will create a new admin badge and set it as the owner, storer, and potentially the recoverer of the account locker. This method takes a single argument which is allow_recover that is a boolean that controls whether the admin can recover resources that have not been claimed. Since some people might not claim their airdrops in time we will set the allow_recover flag to true in this example. CALL_METHOD Address("component_sim1cptxxxxxxxxxfaucetxxxxxxxxx000527798379xxxxxxxxxhkrefh") "lock_fee" Decimal("5000") ; # Calling the `instantiate_simpl` function on the locker package to create a # new account locker. 
CALL_FUNCTION Address("package_sim1pkgxxxxxxxxxlckerxxxxxxxxxx000208064247xxxxxxxxxpnfcn6") "AccountLocker" "instantiate_simple" # This flag controls whether recovery of resources (forceful withdraws) are # allowed or not. We're setting it to true here as we would like to be able # to recover resources that are in the locker and have not been claimed for # a long time. true ; # At this point, the above instruction has returned an admin badge back that # we should deposit into some account. We will need this admin badge later on # to call the `airdrop` method and perform the airdrop. CALL_METHOD Address("account_sim1c8m6h4yv2x9ca0wx5ddtl0nctqmjt2t740wfjgj9w8sdz82zf8ppcr") "try_deposit_batch_or_abort" Expression("ENTIRE_WORKTOP") Enum<0u8>() ; When the above manifest is executed we get an account locker and account locker badge with the following addresses: - Account Locker Component: locker_sim1dpu34m92wkkua3l0gmnre543plaxrlzezgjdpn3t7zx2f7wswv5jeq - Account Locker Admin Badge: resource_sim1t5cyx3cv33nlxrnqjrl2thgkm3pecvsvm4kccannxv5pq0ydckv9n9. We will airdrop some XRD from the faucet to users. To perform the airdrop we will first need to create a proof of the account locker admin badge to allow us to call the airdrop method. Then, we can take the resources from the worktop and pass them to the airdrop method alongside a map that describes the amount that should go to each user. We will be setting the try_direct_send flag to true so that the locker first attempts to deposit the resources into the user accounts and then stores them in the locker if the deposit fails. The following is the manifest we're using to airdrop resources in this example: CALL_METHOD Address("component_sim1cptxxxxxxxxxfaucetxxxxxxxxx000527798379xxxxxxxxxhkrefh") "lock_fee" Decimal("5000") ; # Creating a proof of the locker admin badge. This is required to be able to call the `airdrop` # method on the account locker. 
CALL_METHOD Address("account_sim1c8m6h4yv2x9ca0wx5ddtl0nctqmjt2t740wfjgj9w8sdz82zf8ppcr") "create_proof_of_amount" Address("resource_sim1t5cyx3cv33nlxrnqjrl2thgkm3pecvsvm4kccannxv5pq0ydckv9n9") Decimal("1") ; # Getting some XRD from the faucet to deposit to the users and taking them from the worktop and into # a bucket. CALL_METHOD Address("component_sim1cptxxxxxxxxxfaucetxxxxxxxxx000527798379xxxxxxxxxhkrefh") "free" ; TAKE_ALL_FROM_WORKTOP Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3") Bucket("bucket1") ; CALL_METHOD Address("locker_sim1dpu34m92wkkua3l0gmnre543plaxrlzezgjdpn3t7zx2f7wswv5jeq") "airdrop" # A map of the accounts to airdrop resources to and the amount to airdrop. The following means # that we're airdropping to 5 accounts 1 XRD each. Map( Address("account_sim168fghy4kapzfnwpmq7t7753425lwklk65r82ys7pz2xzleehgpzql2") => Enum<0u8>( Decimal("1") ), Address("account_sim168xl3zsangfxv76ma08hfsrdqt546w8mttjy6h7q7stfnngpdu3se2") => Enum<0u8>( Decimal("1") ), Address("account_sim169490zsun80mg3y0j23ghccm2sw0a4f0rdshxnj2alqcj98c4haksj") => Enum<0u8>( Decimal("1") ), Address("account_sim16xgxu5za5du40x04e0ucxfwxquryrlaezjpvucr2vy26thckmerguq") => Enum<0u8>( Decimal("1") ), Address("account_sim16ypm9kwhamw67kpjhd5rcdpvy3s7levkvr537lpeppjwl6ju0z7k0c") => Enum<0u8>( Decimal("1") ), ) # The bucket of XRD to split across the recipients. Bucket("bucket1") # This flag controls if the account locker should first attempt to deposit the resources in the # claimant accounts or not. When `true` a deposit will be attempted. If the deposit fails then # the resources will be stored in the account locker. true ; # Any resources or change that remains will be deposited into our account. 
CALL_METHOD Address("account_sim1c8m6h4yv2x9ca0wx5ddtl0nctqmjt2t740wfjgj9w8sdz82zf8ppcr") "try_deposit_batch_or_abort" Expression("ENTIRE_WORKTOP") Enum<0u8>() ; Programmatic Locker from Scrypto This is another example that uses the account locker blueprint to build a gumball machine that, instead of returning the gumball token to the caller, attempts to deliver it directly and falls back to storing it in an account locker that it controls, for the user to claim later. use scrypto::prelude::*; #[blueprint] mod gumball_machine { struct GumballMachine { /// The account locker that the gumball machine uses to send resources to /// users. account_locker: Global<AccountLocker>, /// A reference to the resource manager of the gumball resource. gumball_resource: ResourceManager, } impl GumballMachine { pub fn instantiate() -> Global<GumballMachine> { // For convenience we will be setting the global caller badge of // this component as the owner, storer, and recoverer of the account // locker and the owner and minter of the gumball resource. To do // this we must first allocate a global address to derive the global // caller badge from. let ( gumball_machine_address_reservation, gumball_machine_component_address, ) = Runtime::allocate_component_address( GumballMachine::blueprint_id(), ); let global_caller_badge_rule = rule!(require(global_caller( gumball_machine_component_address ))); // Instantiating a new account locker component. let account_locker = Blueprint::<AccountLocker>::instantiate( OwnerRole::Updatable(global_caller_badge_rule.clone()), global_caller_badge_rule.clone(), global_caller_badge_rule.clone(), global_caller_badge_rule.clone(), global_caller_badge_rule.clone(), None, ); // Creating a new resource that is only mintable by the gumball // machine. let resource_manager = ResourceBuilder::new_fungible( OwnerRole::Updatable(global_caller_badge_rule.clone()), ) .divisibility(0) .mint_roles(mint_roles!
{ minter => global_caller_badge_rule.clone(); minter_updater => global_caller_badge_rule; }) .create_with_no_initial_supply(); // Instantiate and globalize the component. Self { account_locker, gumball_resource: resource_manager, } .instantiate() .prepare_to_globalize(OwnerRole::None) .with_address(gumball_machine_address_reservation) .globalize() } /// Calls `store` on the account locker with `try_direct_send` set to /// [`true`] meaning that it will first attempt to deposit the resource /// in the specified destination account and if that fails it will store /// it in the locker. pub fn free(&mut self, destination_address: Global<Account>) { let bucket = self.gumball_resource.mint(1); self.account_locker.store(destination_address, bucket, true); } } } ## Transaction Tracker URL: https://radix.wiki/developers/legacy-docs/reference/radix-engine/native-blueprints/transaction-tracker Updated: 2026-02-18 Summary: This document offers a description of the design and implementation of the Transaction Tracker: its state and internal responsibilities. Reference > Radix Engine > Native Blueprints > Transaction Tracker — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/transaction-tracker.md) This document offers a description of the design and implementation of the Transaction Tracker: its state and internal responsibilities. Low-level details below The Transaction Tracker is an internal implementation detail of the Transaction Executor and, as such, does not offer any Public API. Background In Radix, the Transaction Tracker native component is a specialized data structure used by the Transaction Executor to keep track of the “recently” executed User Transactions: it stores the success/failure indication of each Transaction, keyed by its Intent Hash. This information allows us to validate the Intent Hash of each newly-submitted Transaction before actually executing it, i.e.
to detect a potential duplicate Transaction (which is supposed to be rejected with an IntentHashPreviouslyCommitted error). Future use-case: Cancelling a Transaction Apart from the “replay protection” described above, the Transaction Tracker enables Transaction cancellation: it can store a “cancelled” Transaction Status, and the Transaction Executor’s logic is ready to interpret it as IntentHashPreviouslyCancelled. However, a public API for cancelling Transactions is not yet available. Since every Transaction is valid only within its configured Epoch range, and this configuration has a  hard-limited maximum span (https://github.com/radixdlt/radixdlt-scrypto/blob/ff21f24952318387803ae720105eec079afe33f3/radix-engine-common/src/constants/transaction_validation.rs#L19) , the Transaction Tracker only needs to keep records for some “recent” subset of the executed Transactions. Specifically: only the transactions with Epoch range ending after the current Epoch have any potential of being re-submitted (i.e. all older Transactions would be rejected anyway, due to TransactionEpochNoLongerValid). This allows us to reclaim large volumes of state used by Transaction Tracker, using a Partition-based ring-buffer approach detailed below (transaction-tracker.md#transaction-status-ringbuffer) . Network-wide Singleton The singleton Transaction Tracker’s instance is created and started by the system during Genesis (i.e. it is not supposed to be instantiated by users). You can find its well-known address for each Network here (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/well-known-addresses.md) . On-ledger State The TransactionTracker blueprint defines a single “state” field - a TransactionTrackerSubstate structure, which: - Maintains the metadata needed to interpret the Transaction Status Ring-buffer (transaction-tracker.md#transaction-status-ringbuffer) structure: - start_epoch, indicating the first Epoch covered by the start_partition. 
- start_partition, identifying the currently-oldest of the Partitions comprising the Ring-buffer. - Captures (on-Ledger) important constants used by the logic: - partition_range_start_inclusive - the Ring-buffer’s first usable Partition number (https://github.com/radixdlt/radixdlt-scrypto/blob/ff21f24952318387803ae720105eec079afe33f3/radix-engine/src/blueprints/transaction_tracker/package.rs#L38) (currently 65). - partition_range_end_inclusive - the Ring-buffer’s last usable Partition number (https://github.com/radixdlt/radixdlt-scrypto/blob/ff21f24952318387803ae720105eec079afe33f3/radix-engine/src/blueprints/transaction_tracker/package.rs#L39) (currently 255). - epochs_per_partition - the number of Epochs tracked in a single Partition (https://github.com/radixdlt/radixdlt-scrypto/blob/ff21f24952318387803ae720105eec079afe33f3/radix-engine/src/blueprints/transaction_tracker/package.rs#L40) (currently 100). Apart from the state field, the TransactionTracker Entity employs all its remaining Partitions as Key-Value Collections, implementing a circular buffer of expiring Transaction Statuses. Transaction Status Ring-buffer As mentioned earlier, the Transaction Tracker uses multiple consecutive Partitions as slots in a Ring-buffer covering the entire range of Transactions’ end-Epochs allowed at the current Epoch. Data structure Let’s have a look at a specific situation, based on an example current_epoch = 45168 and actual production constants: Each of these 191 Partitions (within the inclusive range [65; 255]) represents a range of 100 future Epochs. The starting Epoch (i.e. start_epoch field) grows in steps (transaction-tracker.md#buffer-rotation) of 100 (constantly catching up to the current Epoch, when possible) and thus the starting point of the Ring-buffer cycles over the available Partitions (as is usual for all circular buffers (https://en.wikipedia.org/wiki/Circular_buffer) ). 
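The partition arithmetic described above can be sketched in plain Rust. This is a minimal illustration using the production constants quoted in this section; the function and constant names are ours for illustration, not the engine's actual symbols:

```rust
// Sketch of mapping a transaction's end-epoch onto a ring-buffer partition,
// assuming the documented production constants. Illustrative only.
const PARTITION_RANGE_START_INCLUSIVE: u8 = 65;
const PARTITION_RANGE_END_INCLUSIVE: u8 = 255;
const EPOCHS_PER_PARTITION: u64 = 100;

// 255 - 65 + 1 = 191 usable partitions.
const NUM_PARTITIONS: u64 =
    (PARTITION_RANGE_END_INCLUSIVE as u64 - PARTITION_RANGE_START_INCLUSIVE as u64) + 1;

/// Maps a transaction's end-epoch to the partition holding its status,
/// cycling over the available partitions as the buffer rotates.
fn partition_for_end_epoch(start_epoch: u64, start_partition: u8, end_epoch: u64) -> u8 {
    // How many 100-epoch slots past the buffer's starting slot the end-epoch falls.
    let offset = (end_epoch - start_epoch) / EPOCHS_PER_PARTITION;
    // Wrap around the 191 available partitions, as in any circular buffer.
    let slot = (start_partition as u64 - PARTITION_RANGE_START_INCLUSIVE as u64 + offset)
        % NUM_PARTITIONS;
    (PARTITION_RANGE_START_INCLUSIVE as u64 + slot) as u8
}

fn main() {
    // With start_epoch = 45100 and start_partition = 65, end-epochs in
    // [45100; 45199] land in partition 65, [45200; 45299] in partition 66, ...
    assert_eq!(partition_for_end_epoch(45100, 65, 45168), 65);
    assert_eq!(partition_for_end_epoch(45100, 65, 45200), 66);
    // ...and the index wraps back to 65 once it would exceed 255.
    assert_eq!(partition_for_end_epoch(45100, 255, 45250), 65);
    // Total capacity: 191 partitions * 100 epochs = 19100 epochs.
    assert_eq!(NUM_PARTITIONS * EPOCHS_PER_PARTITION, 19100);
}
```

The wrap-around in `partition_for_end_epoch` is the "cycling over the available Partitions" behaviour: once the computed slot passes `partition_range_end_inclusive`, it restarts at `partition_range_start_inclusive`.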
The exact Epoch range of any Partition, at any moment, can be computed based on the numbers found in the state field. Each Partition simply stores the Intent Hashes and Statuses of executed Transactions which have their Epoch ranges ending within the Partition’s Epoch range. The circular buffer behavior used here is simply expiring the no-longer-needed information on sufficiently-old Transactions. Ring-buffer over-allocation A careful reader might have noticed that the Transaction’s maximum Epoch range (i.e. 30 days * 24 hours * 12 epochs = 8640) is significantly lower than the capacity of the Transaction Status Ring-buffer (i.e. 191 partitions * 100 epochs = 19100). This means that at any given time, more than half of the buffer will remain empty - contrary to the over-simplified illustration above. The over-allocation itself does not bring any significant downsides (and potentially allows for painless future adjustments of our Epoch duration or Transaction’s allowed Epoch range). Operations The Transaction Executor interacts with the Transaction Status Ring-buffer in two ways. Intent Hash Validation For each Transaction, the Executor performs (among other things) the following steps: - Validates the Transaction’s Epoch range (before even consulting the Transaction Tracker). - This rejects Transactions which are already past their end-Epoch (they would not have a corresponding slot in the Ring-buffer anymore). - And this also rejects Transactions which have their end-Epoch too far in the future (they would not have a corresponding slot in the Ring-buffer yet). - Locates the Ring-buffer slot corresponding to the Transaction’s end-Epoch. - This determines which Partition’s Key-Value Collection to look at. - Checks whether the determined Partition contains the Transaction’s Intent Hash. - If it does, then it means that current Transaction is a duplicate, and it is rejected. - This is also the point at which the not-yet-available Transaction cancellation is handled. 
- Actually executes the Transaction. - If the Transaction gets committed (regardless of its success/failure), the Executor inserts the Intent Hash and Status into the already-determined Partition (from point 2.). Buffer Rotation After detecting that the current Epoch is greater than start_epoch + epochs_per_partition (in other words: that start_partition no longer covers the current Epoch, but only past Epochs), the Transaction Executor advances the Ring-buffer, which means that: - start_epoch grows by epochs_per_partition. - The entire Partition at start_partition is deleted from the Substate Store (i.e. cleared). - start_partition grows by 1… - …taking into account that it must cycle back to partition_range_start_inclusive once it exceeds the partition_range_end_inclusive. ## Identity / Persona URL: https://radix.wiki/developers/legacy-docs/reference/radix-engine/native-blueprints/identity Updated: 2026-02-18 Summary: The Identity Component Rust Docs Reference > Radix Engine > Native Blueprints > Identity / Persona — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/identity.md) The Identity Component Rust Docs (https://docs.rs/scrypto/latest/scrypto/component/struct.Identity.html) ## Consensus Manager URL: https://radix.wiki/developers/legacy-docs/reference/radix-engine/native-blueprints/consensus-manager Updated: 2026-02-18 Summary: This document offers a description of the design and implementation of the Consensus Manager component: its state, public API, and its system-triggered, internal responsibilities. Reference > Radix Engine > Native Blueprints > Consensus Manager — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/consensus-manager.md) This document offers a description of the design and implementation of the Consensus Manager component: its state, public API, and its system-triggered, internal responsibilities.
Low-level details below If you are only looking for the Consensus Manager’s public interface, please skip directly to the API reference (consensus-manager.md#api-reference) . Background In Radix, the Consensus Manager native component is responsible for bridging the Ledger state with the world of distributed Consensus run by the Validator Nodes. This means: - Storing the current leader Validator, Round number and wall-clock time reported by the Consensus algorithm. - Triggering the Epoch changes. - Tracking all registered Validators and picking the top-staked Active Validator Set to run the Consensus. - Calculating and applying the Validator emissions/penalties according to their tracked performance statistics. - Keeping the global settings of all the rules governing the Network’s behaviors mentioned above. Network-wide Singleton The singleton Consensus Manager’s instance is created and started by the system during Genesis (i.e. it is not supposed to be instantiated by users). You can find its well-known address for each Network here (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/well-known-addresses.md) . The Consensus Manager’s internal responsibilities (consensus-manager.md#internal-responsibilities) (i.e. related to Consensus progress) are performed at the beginning of each Consensus Round, when a special system transaction calls the protected ConsensusManager::next_round() method. Apart from that, the Consensus Manager exposes public methods (consensus-manager.md#methods) for: - Using the Consensus wall-clock. - Creating a new Validator. 
On-ledger State The ConsensusManager blueprint defines the following fields: - config, holding the Network’s Consensus-related configuration, e.g.: - maximum number of Active Validators, - target round count and duration of an Epoch, - timeouts for unstaking / unlocking / fee changes, - amount of XRD emitted for Validators on each Epoch change, - cost of creating a Validator, - minimum reliability expected from a Validator (according to which the Network Emission penalties are applied). - state, holding the Consensus’ progress: - current Epoch and Round, - timestamps of the current Epoch’s start: - the “effective start”, used to maintain the Epoch’s target duration; - the “actual start”, used only to measure the potential drift of the Epoch’s start. - a reference to the current Leader (among the Active Validator Set). - validator_rewards containing: - the actual XRD Vault holding all Network Fees and Tips collected during the current Epoch (to be distributed on the Epoch change), - accounting information tracking the Rewards assigned to individual Leaders for transactions they proposed (used to calculate the Rewards distribution from the Vault mentioned above). - current_validator_set, which simply represents the Active Validator Set of the current Epoch, as a list of component addresses, public keys and stakes (captured at the Epoch’s start). - current_proposal_statistic, which tracks the number of successful vs missed proposals of each Validator of the current Epoch (for the Rounds during which it was a Leader). - proposer_minute_timestamp holding a minute-resolution counterpart of the proposer_milli_timestamp described below. - proposer_milli_timestamp holding a millisecond-resolution Unix timestamp captured by the Leader at the moment of proposing the most recently committed Round. Note: the Engine’s logic ensures that this number is non-decreasing (even if there is a clock skew between consecutive Leaders).
And a single collection: - registered_validators_by_stake - a sorted index of all currently registered Validators, by their stake (descending): - The sort key is calculated as a scaled-down and inverted stake of a Validator. This is done so that this single 16-bit integer represents an approximate magnitude of a stake, with the highest stakes being first (in the natural ordering). - This structure ensures convenient and performant selection of the next Active Validator Set. Internal responsibilities The public-facing services provided by the Consensus Manager are quite modest (accessing the wall-clock and creating new validator components). However, apart from those, it has an important internal Consensus-related responsibility: on each Round, the Consensus Leader automatically triggers the ConsensusManager’s protected next_round() method, which progresses the Consensus’ Rounds and Epochs. Round Change The Leader of the Round starts it with a special transaction, which provides the details of the new Round (i.e. as inputs to the next_round() method being called). This method then: - Updates the wall-clock (i.e. the proposer_milli_timestamp and proposer_minute_timestamp substates) with the timestamp provided in the input. - The new timestamp is validated: it must be greater than or equal to the previous one, or the transaction will fail. - Updates the proposal statistics of the current Epoch (i.e. the current_proposal_statistic). - In most cases, on a healthy network, this will simply increment the number of successful rounds of the current Leader. - However, it may happen that one or more Rounds were skipped before the one being started. In such a case, the current Leader will provide the list of past Leaders which missed their Rounds (and their “miss counters” will be incremented accordingly). - The statistics are accumulated this way throughout the Epoch - they will be used for calculating potential Emission penalties of each Validator, at the end of the Epoch.
- Updates the current Leader reference (within the state substate). - Determines whether the current Epoch should end (i.e. whether the currently started Round should in fact become the first round of a newly-started Epoch). This simply evaluates the Epoch change condition stored within the config substate: - If the started Round’s number is strictly less than the configured minimum Round count ( currently on mainnet: (https://github.com/radixdlt/babylon-node/blob/d2af01b8f27df69b7cfc07d784777a3490a8c640/core-rust-bridge/src/main/java/com/radixdlt/genesis/GenesisConsensusManagerConfig.java#L134) 500), the Epoch definitely continues. - If the started Round’s number is greater than or equal to the configured maximum Round count ( currently on mainnet: (https://github.com/radixdlt/babylon-node/blob/d2af01b8f27df69b7cfc07d784777a3490a8c640/core-rust-bridge/src/main/java/com/radixdlt/genesis/GenesisConsensusManagerConfig.java#L135) 3000), the Epoch definitely ends. - If the above hard-limiting special cases do not occur, then the Epoch ends as soon as it has lasted for the configured target Epoch duration ( currently on mainnet: (https://github.com/radixdlt/babylon-node/blob/d2af01b8f27df69b7cfc07d784777a3490a8c640/core-rust-bridge/src/main/java/com/radixdlt/genesis/GenesisConsensusManagerConfig.java#L131) 5 minutes). - Applies the actual Round/Epoch change: - If the Epoch is not supposed to end, then simply the new Round number is updated (within the state substate), and a RoundChangeEvent is emitted. - Otherwise, an Epoch Change is applied, as detailed in the section below. Epoch Change When the Round Change logic (i.e. point 4. described above) determines the end of the current Epoch, the Consensus Manager: - Calculates and distributes the Network Emissions to Validators belonging to the ended Epoch’s Active Validator Set: - First, for each Validator, we calculate its “reliability factor”.
This is simply rescaling its absolute successful proposal ratio (recorded within current_proposal_statistic) into a value relative to the minimum required reliability (set in the config). Example Let’s assume a Validator had 7 successful and 3 missed Rounds. Then, its absolute reliability is 0.7. If the minimum required reliability is configured as 0.6, then the rescaling (0.7 - 0.6) / (1.0 - 0.6) gives us the reliability factor of 0.25. Mainnet configuration note Currently on mainnet (https://github.com/radixdlt/babylon-node/blob/d2af01b8f27df69b7cfc07d784777a3490a8c640/core-rust-bridge/src/main/java/com/radixdlt/genesis/GenesisConsensusManagerConfig.java#L146) , the minimum required reliability is set to 1.0. This means that effectively, the reliability factor of any Validator may either be 1.0 (if it did not miss any Round during an Epoch) or 0.0 (if it missed even one). - Then, a configured amount of XRD is minted for Network Emissions purposes ( currently on mainnet: (https://github.com/radixdlt/babylon-node/blob/d2af01b8f27df69b7cfc07d784777a3490a8c640/core-rust-bridge/src/main/java/com/radixdlt/genesis/GenesisConsensusManagerConfig.java#L140) ~2853.9 per Epoch, which is approximately 300M XRD per year). - Each Validator is assigned a fraction of the above Emission, proportional to its stake (compared to the total stake of the entire Active Validator Set). - However, the actually received amount is multiplied by the Validator’s reliability factor. This means that some of the Emission may not be distributed at all. Example Let’s assume a Validator with a 200 XRD stake, and an Active Validator Set with a total stake of 1000 XRD. For simplicity, let’s say that 100 XRD is minted for Network Emissions each Epoch. In theory, our Validator is entitled to 20% of this Emission (i.e. 20 XRD). However, if the Validator missed some of its Rounds, and has a reliability factor of 0.25, then it will actually receive only 20 × 0.25 = 5 XRD. 
The remaining 15 XRD will not be distributed (neither to this Validator, nor to any other from the Active Validator Set). - As an implementation detail, the “not distributed” part of the Emission is not explicitly burned, but simply calculated upfront (i.e. an appropriately lower Emission is actually minted). - Distributes Proposer Rewards to the ended Epoch’s Active Validator Set: - These Rewards are already calculated (on a per-Validator basis) and stored directly in the validator_rewards substate. - For context: they contain 100% of the voluntary Tips and 25% of the Network Fees charged for executing the transactions proposed by a particular Validator. - Unlike Emissions, they are received in full, regardless of the Validator’s reliability. - Calculates and distributes the remaining shared Rewards to the ended Epoch’s Active Validator Set: - This is supposed to distribute the amount remaining in the validator_rewards Vault proportionally among all Active Validators. - For context: it comes from 25% of the Network Fees charged for executing all transactions during the ended Epoch. As a side note, one can correctly observe that 50% of those Fees are burned during the transaction execution itself. - These rewards are divided according to each Validator’s effective stake: i.e. its stake multiplied by the reliability factor (the same one as used for Network Emissions distribution). Example Let’s assume that the Vault within the validator_rewards substate contained 100 XRD, and after distributing 10 XRD as Proposer Rewards (according to point 2. above), 90 XRD is left for the Active Validator Set’s Rewards. Let’s say that the Active Validator Set consists of only 2 Validators: A with stake 4000 XRD and reliability factor 1.0 and B with stake 1000 XRD and reliability factor 0.5. Then, the effective stake of A is 4000 XRD, and the effective stake of B is 500 XRD. This means that A will receive 80 XRD, while B will receive 10 XRD.
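The two worked examples above (the reliability-factor rescaling and the effective-stake reward split) can be reproduced with a short sketch. This is illustrative only: the function names are ours, and the real engine uses exact Decimal arithmetic rather than floating point:

```rust
// Sketch (with illustrative names) of the reliability-factor rescaling and the
// effective-stake reward distribution described above. Floating point is used
// here for brevity; the engine itself works with exact Decimal values.

/// Rescales an absolute successful-proposal ratio into a reliability factor
/// relative to the configured minimum required reliability.
fn reliability_factor(successful: u64, missed: u64, min_required: f64) -> f64 {
    let absolute = successful as f64 / (successful + missed) as f64;
    ((absolute - min_required) / (1.0 - min_required)).max(0.0)
}

/// Splits a reward pool across validators proportionally to their
/// effective stake (stake multiplied by reliability factor).
/// Each input tuple is (stake, reliability_factor).
fn distribute_rewards(pool: f64, validators: &[(f64, f64)]) -> Vec<f64> {
    let total_effective: f64 = validators.iter().map(|(s, r)| s * r).sum();
    validators.iter().map(|(s, r)| pool * s * r / total_effective).collect()
}

fn main() {
    // 7 successful / 3 missed Rounds, minimum required reliability 0.6 -> 0.25.
    assert!((reliability_factor(7, 3, 0.6) - 0.25).abs() < 1e-9);

    // 90 XRD split between A (4000 XRD, factor 1.0) and B (1000 XRD, factor 0.5):
    // effective stakes are 4000 and 500, so A gets 80 XRD and B gets 10 XRD.
    let rewards = distribute_rewards(90.0, &[(4000.0, 1.0), (1000.0, 0.5)]);
    assert!((rewards[0] - 80.0).abs() < 1e-9);
    assert!((rewards[1] - 10.0).abs() < 1e-9);
}
```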
- Any leftovers remaining in the Rewards’ Vault (due to rounding down during calculations) will simply be retrievable on the next Epoch. These will normally be negligibly small amounts. - Selects the new Active Validator Set (for the newly-started Epoch): - This simply requires listing the configured number of top-stake Validators ( currently on mainnet: (https://github.com/radixdlt/babylon-node/blob/d2af01b8f27df69b7cfc07d784777a3490a8c640/core-rust-bridge/src/main/java/com/radixdlt/genesis/GenesisConsensusManagerConfig.java#L153) 100) from the registered_validators_by_stake collection. - The selected Validators are stored in the current_validator_set, together with their stake captured at this moment. - Naturally, the current_proposal_statistic and the per-Validator counters within the validator_rewards are cleared (i.e. all Validators start the Epoch with a clean slate). - Emits the EpochChangeEvent: - It contains the started Epoch’s number and the new Active Validator Set. - Additionally, it contains summarized information about the Protocol Versions for which the new Active Validators signalled their readiness. - Updates its state substate: - The Epoch number is incremented (always by exactly 1; Epochs cannot be skipped, unlike Rounds). - The Round number is set to 0. - The “actual” Epoch start timestamp is set to the Proposer’s timestamp. - The “effective” Epoch start timestamp is calculated, in a way designed to mitigate a small systematic drift caused by network latencies: - If the ended Epoch’s actual duration is within 10% tolerance from the configured target Epoch duration, then next.effective_start = previous.effective_start + config.target_duration (i.e. as if that Epoch had precisely the target duration). - Otherwise, the effective start timestamp is set to the Proposer’s timestamp (i.e. the same as the actual Epoch start timestamp). API Reference Methods This section focuses only on publicly-available methods. The internal ones (i.e.
create, start, next_round) use very restrictive access rules (or even completely custom ones), and are automatically called only by the system itself. get_current_epoch Gets the current Epoch number. This is a simple read-out from the state field. Name get_current_epoch Type Method Callable By Public Arguments None Returns Epoch: A type-safe wrapper containing the current Epoch’s number. get_current_time Gets the current Proposer timestamp (i.e. not the one committed to ledger, but the one with which it is actually being proposed), with the requested precision. Name get_current_time Type Method Callable By Public Arguments precision - TimePrecision: The required precision enum. Returns Instant: A type-safe wrapper containing the number of seconds since Unix epoch. Please note that even though it is expressed as seconds, its actual precision is driven by the argument. compare_current_time Checks whether the given condition on the current Proposer timestamp is met. Name compare_current_time Type Method Callable By Public Arguments instant - Instant: The reference timestamp to compare against. precision - TimePrecision: The comparison precision. The reference timestamp will be truncated accordingly. The proposer timestamp will be read out as in get_current_time. operator - TimeComparisonOperator: The operator to apply. The exact condition can be expressed as: get_current_time(precision) {operator} instant.truncated_to(precision) Returns bool: True if the current Proposer timestamp meets the given condition. create_validator Creates a new Validator component, allowing the caller to configure a Validator Node and potentially participate in the Consensus.
Name create_validator Type Method Callable By Public Arguments key - Secp256k1PublicKey: The Public Key identifying the Validator. fee_factor - Decimal: An initial “Validator’s fee fraction” configuration. Please consult the relevant Validator’s documentation (validator.md#updatefee) . xrd_payment - Bucket: A bucket containing the required XRD amount. The cost is specified in the Consensus Manager’s config substate. Returns (ComponentAddress, Bucket, Bucket): A tuple with: - The address of the created Validator. - A bucket containing the newly-created Owner Badge. - A bucket with any remaining contents of xrd_payment. Events The Consensus Manager only emits the following progress-related events: RoundChangeEvent { // The new Round. round: Round } Emitted when the Round has changed - with the important exception of an Epoch change (i.e. this event is not emitted when the Round changes to Round::zero() at the beginning of an Epoch). EpochChangeEvent { // The new Epoch. epoch: Epoch, // The new Epoch's Validator Set. validator_set: ActiveValidatorSet, // A mapping of protocol version name to a total stake (within the new Epoch's // Validator Set) that has signalled the readiness for the given protocol update. // The mapping only contains entries with associated stake of at least 10% of the // total stake. significant_protocol_update_readiness: IndexMap<String, Decimal> } Emitted when the Epoch has changed.
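The truncation semantics of compare_current_time, described in the table above, can be sketched as follows. This is a minimal model with our own names and only two precisions; the engine's actual TimePrecision, Instant, and TimeComparisonOperator types are richer:

```rust
// Sketch (illustrative names, not the engine's actual types) of how
// `compare_current_time` truncates BOTH timestamps to the requested
// precision before applying the comparison operator.
#[derive(Clone, Copy)]
enum TimePrecision {
    Second,
    Minute,
}

/// Truncates a seconds-since-Unix-epoch value to the given precision.
fn truncated_to(seconds: u64, precision: TimePrecision) -> u64 {
    match precision {
        TimePrecision::Second => seconds,
        TimePrecision::Minute => seconds - seconds % 60,
    }
}

/// `current >= reference`, evaluated at the given precision - the shape of a
/// `compare_current_time` call with a greater-or-equal operator.
fn is_at_or_after(current: u64, reference: u64, precision: TimePrecision) -> bool {
    truncated_to(current, precision) >= truncated_to(reference, precision)
}

fn main() {
    // At minute precision, 36030s and 36059s both truncate to 36000s,
    // so they compare as the same instant.
    assert!(is_at_or_after(36_030, 36_059, TimePrecision::Minute));
    // At second precision, the same comparison is false (36030 < 36059).
    assert!(!is_at_or_after(36_030, 36_059, TimePrecision::Second));
}
```

The key point the sketch demonstrates is that the chosen precision changes the outcome of the comparison, because truncation happens on both sides before the operator is applied.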
## Intent Processor URL: https://radix.wiki/developers/legacy-docs/reference/radix-engine/native-blueprints/intent-processor Updated: 2026-02-18 Summary: The intent processor (previously known as the transaction processor) is responsible for running transactions, by interpreting an Intent Manifest (transaction-manifest) Reference > Radix Engine > Native Blueprints > Intent Processor — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/intent-processor.md) The intent processor (previously known as the transaction processor) is responsible for running transactions, by interpreting an Intent Manifest (transaction-manifest) and executing the commands it contains. When the Radix Engine executes transactions, each intent has its own call stack, and the Intent Processor is the top call-frame of that stack - this is often referred to as the "Transaction Layer". Features of the Transaction Layer Transaction manifests orchestrate the movement of resources between components. This includes accounts, which are also components (from which only their owner may withdraw), and other intents (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/intent-structure.md) in the transaction. This is done through a sequence of instructions using a special instruction set created specifically for this purpose (transaction manifests do not use Scrypto). The Radix Engine processes these instructions in order, and if any step fails for any reason, the entire transaction fails and none of the steps are committed to the ledger on the Radix network. This is what is meant by the transaction being "atomic". Execution of a given transaction can be thought of as happening at its own "layer", above any components that are called during the transaction. This layer has some special features that make transaction manifests quite powerful. The Worktop The most common instruction in a transaction manifest is a component call.
Each call to a component can include data and buckets of resources, and each component may then return resources. The transaction layer itself must include a way of managing resources between component calls. For this, we introduce the worktop. Each transaction has a worktop: a place where resources may be held during transaction execution. Resources returned from component calls are automatically put onto the worktop. From there, the manifest may specify that resources on the worktop be put into buckets so that those buckets may be passed as arguments to other component calls. The manifest may also use ASSERT commands to check the contents of the worktop, causing the entire transaction to fail if not enough of the checked resource is present. This is useful to guarantee the results of a transaction, even if you are unsure of what a component may return. Of course, all resources must be in a vault by the end of any transaction, so the transaction manifest creator must ensure that no resources are left on the worktop or in buckets by the end of the manifest’s steps. The Authorization Zone Another key concept is the authorization zone. The Authorization Zone acts somewhat similarly to the worktop, but is used specifically for authorization. Rather than holding resources, the authorization zone holds proofs. A proof is a special object that proves the possession of a given resource or resources. When a component method or blueprint function is called by the transaction manifest, the proofs currently in the transaction’s authorization zone are automatically used to validate against the authorization rules defined in that method/function’s role assignments. If this check fails, the transaction is aborted. Proofs can enter the auth zone from two places: - Signatures on the transaction are automatically added to the authorization zone as "virtual signature proofs".
This, for example, is how you are able to call the withdraw method on a pre-allocated account component which is still set up to require a proof of a signature with its corresponding public key hash. - Proofs can also be returned by calls to methods. They are automatically added to the authorization zone by the transaction processor (transaction-processor) . For more about proofs and authorization, please see the Authorization model (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/README.md) . ## Validator URL: https://radix.wiki/developers/legacy-docs/reference/radix-engine/native-blueprints/validator Updated: 2026-02-18 Summary: This document offers a description of the design and implementation of the Validator blueprint. Additionally, this document provides an API reference which documents each of the Validator blueprint methods and functions in detail. Reference > Radix Engine > Native Blueprints > Validator — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/validator.md) This document offers a description of the design and implementation of the Validator blueprint. Additionally, this document provides an API reference which documents each of the Validator blueprint methods and functions in detail and how they can be used. Background In Radix, a Validator is a Node with a delegated “stake” (i.e. some amount of locked XRD) on the Network. The top 100 registered Validators (i.e. of highest stake) form the Active Validator Set and are responsible for executing transactions through Consensus. Each Validator is registered with the Consensus Manager. At the end of each epoch, based on various rules, the Consensus Manager chooses a new Active Validator Set of validators and stakes to run the next epoch. The Validator blueprint defines the states associated with a Validator (covered by the On-ledger State section), and the logic for modifying them (covered by the API Reference section).
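The Active Validator Set selection mentioned above (the top registered Validators by stake) can be sketched as follows. Names and the in-memory sort are illustrative only: the Consensus Manager actually reads a pre-sorted on-ledger index (registered_validators_by_stake) rather than sorting at epoch change:

```rust
// Sketch (illustrative, not the engine's actual code) of choosing the Active
// Validator Set: take the configured number of top-staked registered validators.
fn select_active_set(
    mut registered: Vec<(&'static str, u64)>, // (validator address, stake)
    max_validators: usize,                    // e.g. 100 on mainnet
) -> Vec<&'static str> {
    // Sort by stake, descending - mirroring the ordering that the
    // registered_validators_by_stake index maintains on-ledger.
    registered.sort_by(|a, b| b.1.cmp(&a.1));
    registered
        .into_iter()
        .take(max_validators)
        .map(|(addr, _stake)| addr)
        .collect()
}

fn main() {
    let registered = vec![("v1", 500), ("v2", 1200), ("v3", 900), ("v4", 100)];
    // With a cap of 3, the lowest-staked validator ("v4") is excluded.
    assert_eq!(select_active_set(registered, 3), vec!["v2", "v3", "v1"]);
}
```

Note that, as the next sections explain, an unregistered validator is excluded from this selection regardless of its stake.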
From the execution point of view, the Validator accrues this stake from: - Individual Users (including the Validator's Owner): They can delegate their XRD using the stake method on a Validator Component (and receive Stake Units in return). - Network Emissions: The Network emits a predefined amount of XRD at the end of each Epoch, and each Active Validator receives a part proportional to its stake. - Rewards, coming from two sources: - Transaction execution Fees: Each executed Transaction requires paying an appropriate Fee, part of which goes to the current Leader Validator, and part is shared among all Active Validators. - Transaction Tips: A User submitting a Transaction may specify a percentage-based Tip, to be paid to the Leader Validator. A Validator has a single role, the Owner: - The Owner is responsible for maintaining and setting the parameters of a Validator (e.g. specifying a Public Key to be used for Consensus, or a Validator Fee percentage raked from Network Emissions to the Owner). - The Owner can decide to temporarily lock some of their Stake Units in the Validator's internal delayed-withdrawal Vault, as a public display of their confidence in their Validator Node's future reliability.

On-ledger State

The Validator blueprint defines two fields: - The state, holding the actual configuration and sub-components used for managing the processes of Staking and Owner's Stake Locking (see the relevant sections below) - The meta-level protocol_update_readiness_signal, holding only the protocol version to which the Node (currently represented by this Validator component) is ready to upgrade. The "protocol update" flow is still under development, and will be covered in a separate document. The high-level pieces of state are: - Consensus-related configuration: - key - a Public Key, identifying the corresponding Node in the Consensus layer. See the update_key API method. - is_registered - whether the Owner currently wants this Validator to be a part of the Active Validator Set - i.e.
if this flag is false, then even a Validator with the highest stake among them all will not be considered by the Consensus Manager when choosing a new Active Validator Set for the next epoch. See the register/unregister API methods. - sorted_key - A key used internally for storage of registered Validators, sorted by their stake, descending. - It is only useful when the Validator is registered and has non-zero stake (the field is None otherwise). - Technically, this field is only a cache (its value could be computed from the Validator's Stake Vault) and simplifies certain updates implementation-wise. - Stake-related configuration and sub-components: - accepts_delegated_stake - whether Users other than the Owner can currently delegate stake to this Validator. See the update_accept_delegated_stake API method. - stake_unit_resource - the resource address of the fungible resource used as this Validator's Stake Units. - stake_xrd_vault_id - the Validator's Stake Vault, holding the XRD currently staked to this Validator. Users interact with this Vault when calling the stake/unstake API methods. - pending_xrd_withdraw_vault_id - the Pending Withdraw Vault, holding the XRD that was unstaked, but not yet claimed by Users. In other words: the unstaked amount waits here during the unstake delay. See the details in the Staking/Unstaking/Claiming section below. - claim_nft - the resource address of the non-fungible tokens used as a receipt for Stake Units unstaked from this Validator. These can be exchanged for the actual XRD after the unstake delay (see the claim_xrd API method). The Unstake NFT section describes the token's internal data. - Fee-related configuration: - validator_fee_factor - the fraction of this Validator's Emissions which gets raked by the Validator's Owner.
- Important note: it may happen that the value held by this field is no longer the actual fee factor; it is implicitly overridden by the latest Validator Fee Change Request after the change becomes effective - see the validator_fee_change_request description below. - The fee is raked by immediately staking the indicated fraction of the Emissions and locking the resulting Stake Units in the Owner Locked Stake Vault (see the details in the Locking Stake Units by the Owner section). - This value is expressed as a decimal factor, not a percentage (i.e. 0.015 means "1.5%" here). - validator_fee_change_request - the most recent request to change the validator_fee_factor (if any). - As noted in the update_fee API method, the change becomes effective after a delay (i.e. it requires a 2-week wait in the case of a fee increase). - The value from this field is moved into validator_fee_factor only when a subsequent change is requested after this one has become effective. For this reason, the requested-and-already-effective fee will be used by the Engine instead of the outdated validator_fee_factor (this only reiterates the Important note above). - Owner Locked Stake management: - locked_owner_stake_unit_vault_id - the Owner Locked Stake Vault, holding the Stake Units that this Validator's Owner has voluntarily decided to temporarily lock, as a public display of their confidence in the associated Node's future reliability (see the entire Locking Stake Units by the Owner section). This vault is private to the Owner (i.e. the Owner's badge is required for any interaction via lock_owner_stake_units/start_unlock_owner_stake_units). - pending_owner_stake_unit_unlock_vault_id - the Owner Pending Unlock Vault, holding the Stake Units which the Owner has decided to withdraw from the Owner Locked Stake Vault, but which have not yet been unlocked after the mandatory 4-week wait. This vault is private to the Owner (i.e.
the Owner's badge is required for any interaction via finish_unlock_owner_stake_units). - pending_owner_stake_unit_withdrawals - An inline collection of all currently pending Owner Stake Units "unlocking" operations. - Schema-wise, this is an ordered map from an epoch number to the amount of stake units that become unlocked at that epoch. - For performance reasons, the size of this map is limited to 100 entries: a subsequent start_unlock_owner_stake_units call will first attempt to move any already-available amount to the already_unlocked_owner_stake_unit_amount field, and only then fail if the limit is exceeded. - already_unlocked_owner_stake_unit_amount - The amount of the Owner's Stake Units that has already waited for a sufficient number of epochs in pending_owner_stake_unit_withdrawals and was then automatically moved from there. - The very next finish_unlock_owner_stake_units call will release this amount (plus any additional already-unlocked entries from pending_owner_stake_unit_withdrawals). - Technically, this field is only a cache (its value could be computed from pending_owner_stake_unit_withdrawals if it had no size limit).

Staking/Unstaking/Claiming

All Users may participate in the Emissions and Rewards earned by Validators by delegating their own stake to one or more Validators. This is achieved by calling the stake method of a Validator with an XRD bucket. In exchange for this XRD, the User receives back some amount of a fungible resource called Stake Units (specific to that Validator). They represent the fractional "ownership" the User can claim of the pool of XRD held in that Validator's Stake Vault. After Emissions and Rewards are accrued through many Epochs, the User may then claim their proportion of the Stake Vault by calling the unstake method (and passing the Stake Units they received earlier).
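The proportional relationship between the XRD in the Stake Vault and the Stake Units in circulation can be sketched in plain Rust. This is an illustrative model only, not the blueprint's code: the `StakePool` type and its fields are hypothetical names, `f64` stands in for the ledger's `Decimal` type, and the unstake delay, fees, and reliability penalties are omitted.

```rust
// Hypothetical sketch of stake/unstake proportionality. Not the Validator
// blueprint's actual code: f64 stands in for Decimal, and the unstake
// delay (Unstake NFT) is omitted.

struct StakePool {
    stake_pool_xrd: f64,    // XRD held in the Stake Vault
    stake_unit_supply: f64, // total Stake Units in circulation
}

impl StakePool {
    /// stake: XRD in, newly minted Stake Units out, preserving every
    /// existing staker's fractional ownership of the vault.
    fn stake(&mut self, xrd: f64) -> f64 {
        let minted = if self.stake_unit_supply == 0.0 {
            xrd // first staker: one Stake Unit per XRD
        } else {
            xrd * self.stake_unit_supply / self.stake_pool_xrd
        };
        self.stake_pool_xrd += xrd;
        self.stake_unit_supply += minted;
        minted
    }

    /// unstake: Stake Units burned, proportional XRD released (in the real
    /// blueprint this XRD waits in the Pending Withdraw Vault for ~1 week).
    fn unstake(&mut self, stake_units: f64) -> f64 {
        let xrd = stake_units * self.stake_pool_xrd / self.stake_unit_supply;
        self.stake_pool_xrd -= xrd;
        self.stake_unit_supply -= stake_units;
        xrd
    }

    /// An Emission adds XRD without minting Stake Units, so each unit's
    /// redemption value rises.
    fn apply_emission(&mut self, xrd: f64) {
        self.stake_pool_xrd += xrd;
    }
}
```

Because emissions add XRD without minting Stake Units, a holder who staked before an emission can later redeem more XRD than they put in, which is how delegated stakers earn their share.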
They will then receive an Unstake NFT which represents an amount of XRD claimable from the Validator's Pending Withdraw Vault after a certain number of Epochs, roughly equivalent to a 1-week wait. Once the target Epoch is reached, the Unstake NFT can be exchanged for actual XRD using the Validator's claim_xrd method. The following is a sequence diagram describing this process: [Diagram: Staking, Unstaking and Claiming] Note: This diagram avoids the complexity related to "automatic staking and locking" of the Owner's share of Emissions and Rewards. These details do not affect the general mechanism of staking.

Unstake NFT

The Unstake NFT which is returned to the User upon unstaking has the following data:

```rust
pub struct UnstakeData {
    /// Always "Stake Claim".
    pub name: String,
    /// An epoch number at (or after) which the pending unstaked XRD may be claimed.
    pub claim_epoch: Epoch,
    /// An XRD amount to be claimed.
    pub claim_amount: Decimal,
}
```

Locking Stake Units by the Owner

"Stake Units locking" is the Validator Owner's way of publicly displaying "their own skin in the game". The Owner, after staking to their Validator, may at any time decide to lock their Stake Units inside the Validator's internal Owner Locked Stake Vault. This temporary lock only adds constraints to the Owner, but is intended to serve as a signal to potential stakers, saying "I am trustworthy as a Validator, since I am economically committed to the validator running in an orderly fashion". In the future, these locked stake units may be at risk of slashing if the validator purposefully subverts the expectations of the consensus protocol. Why does it only work when the Owner's Stake Units are locked in a dedicated, delayed-withdrawal Vault? We wish to have a signal that the Owner is committed to their Validator's orderly behaviour. Economically speaking, this relates to their stake units not losing value.
If a Validator owner were aware that their Validator might go offline or commit some form of attack which could result in their Stake Units being worth less, then they could attempt to sell or unstake these Stake Units shortly before this happens. Therefore, simply holding Stake Units in their Account does not prove any ongoing commitment. Instead, we require that the Owner's stake be locked to the Validator. Currently there is a 4-week withdrawal delay, which is purposefully longer than the 1-week unstake delay. This gives time for Users to unstake if they see that a Validator Owner has started unlocking a large proportion of Owner Locked Stake. The Stake Units locked in the Owner Stake Vault are a bit trickier to unlock: the operation is started by moving the Stake Units from the Owner Locked Stake Vault to the Owner Pending Unlock Vault. The Validator Component internally tracks all such pending withdrawal requests, and only allows actual withdrawal of Stake Units from the Owner Pending Unlock Vault after a preconfigured number of Epochs (roughly equivalent to a 4-week wait). The following is a sequence diagram describing the entire process: [Diagram: Locking and Unlocking Owner Stake Units] Note: This diagram purposefully does not show any staking/unstaking. These are separate processes: the Owner still needs to stake to their Validator (the usual way) if they want to obtain Stake Units to lock - and similarly, after finishing the lengthy unlocking, the Owner still needs to go through the regular unstaking process in order to obtain XRD. Why is there no "Owner Locked Stake NFT" involved in this process? All the relevant Validator methods (i.e. lock_owner_stake_units, start_unlock_owner_stake_units, finish_unlock_owner_stake_units) are only accessible to the Validator's Owner. The locked amount and pending withdrawals are all tracked internally in the Validator's Component state - see the On-ledger State section for details.
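The pending-withdrawal bookkeeping described above (the ordered map of unlock epochs, its 100-entry limit, and the already-unlocked cache field) can be sketched as follows. This is a hypothetical simplification: names, the delay constant, and `f64` amounts are illustrative stand-ins for the blueprint's Decimal state and network-configured delay.

```rust
use std::collections::BTreeMap;

// Hypothetical sketch of the Owner Stake Unit unlock bookkeeping. The delay
// constant is a placeholder for the network-configured ~4-week epoch count.

const UNLOCK_DELAY_EPOCHS: u64 = 8064; // placeholder, not the real value
const MAX_PENDING_ENTRIES: usize = 100;

#[derive(Default)]
struct OwnerStakeLock {
    // epoch at which the entry matures -> amount of Stake Units
    pending_withdrawals: BTreeMap<u64, f64>,
    // cache of amounts whose maturity epoch has already passed
    already_unlocked: f64,
}

impl OwnerStakeLock {
    // Move matured entries into the `already_unlocked` cache field.
    fn sweep(&mut self, current_epoch: u64) {
        let matured: Vec<u64> = self
            .pending_withdrawals
            .range(..=current_epoch)
            .map(|(&epoch, _)| epoch)
            .collect();
        for epoch in matured {
            self.already_unlocked += self.pending_withdrawals.remove(&epoch).unwrap();
        }
    }

    // start_unlock_owner_stake_units: schedule an amount for delayed withdrawal,
    // sweeping matured entries first and only then enforcing the size limit.
    fn start_unlock(&mut self, current_epoch: u64, amount: f64) -> Result<(), &'static str> {
        self.sweep(current_epoch);
        if self.pending_withdrawals.len() >= MAX_PENDING_ENTRIES {
            return Err("too many pending unlock entries");
        }
        *self
            .pending_withdrawals
            .entry(current_epoch + UNLOCK_DELAY_EPOCHS)
            .or_insert(0.0) += amount;
        Ok(())
    }

    // finish_unlock_owner_stake_units: withdraw everything matured (possibly zero).
    fn finish_unlock(&mut self, current_epoch: u64) -> f64 {
        self.sweep(current_epoch);
        let released = self.already_unlocked;
        self.already_unlocked = 0.0;
        released
    }
}
```

A `BTreeMap` keyed by epoch mirrors the "ordered map from an epoch number to an amount" schema: matured entries are always the smallest keys, so sweeping is a prefix scan.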
Additionally, the Owner Stake Vault serves as a destination for the following Validator Owner's earnings: - Validator Fee (a configured percentage raked from the Emissions), - Rewards (both from Tips and from execution Fees). Even though the above amounts are originally given as XRD, they are automatically staked (as if the regular stake method was called) and then the obtained Stake Units are locked in the Owner Stake Vault (as if the regular lock_owner_stake_units method was called).

API Reference

Methods

stake

Stakes an amount of XRD to the Validator in exchange for Stake Units. This call will fail if the Validator is currently configured to NOT accept delegated stake.

- Name: stake
- Type: Method
- Callable By: Public
- Arguments: stake - Bucket: An XRD bucket
- Returns: Bucket: A bucket containing the Validator's Stake Unit resource

stake_as_owner

Stakes an amount of XRD to the Validator in exchange for Stake Units. This call will succeed even if the Validator is currently configured to NOT accept delegated stake, but is only callable by the Owner.

- Name: stake_as_owner
- Type: Method
- Callable By: Owner
- Arguments: stake - Bucket: An XRD bucket
- Returns: Bucket: A bucket containing the Validator's Stake Unit resource

unstake

Begins the process of unstaking XRD from a Validator. Removes a proportionate amount of XRD from the Stake Vault and returns an Unstake NFT.

- Name: unstake
- Type: Method
- Callable By: Public
- Arguments: stake - Bucket: A bucket of the Validator's Stake Units
- Returns: Bucket: A bucket containing an Unstake NFT

claim_xrd

Claims the XRD associated with the given Unstake NFT(s). This call will fail if any Unstake NFT's claim_epoch has not been reached yet.

- Name: claim_xrd
- Type: Method
- Callable By: Public
- Arguments: bucket - Bucket: A bucket of Unstake NFTs
- Returns: Bucket: An XRD bucket

accepts_delegated_stake

Queries whether the Validator is currently accepting delegated stake (i.e. whether it allows the stake method to be called).
- Name: accepts_delegated_stake
- Type: Method
- Callable By: Public
- Arguments: None
- Returns: bool: True if this Validator accepts delegated stake, false otherwise

total_stake_xrd_amount

Queries the total amount of XRD in the Validator's Stake Vault.

- Name: total_stake_xrd_amount
- Type: Method
- Callable By: Public
- Arguments: None
- Returns: Decimal: The amount of XRD in the Validator's Stake Vault

total_stake_unit_supply

Queries the total amount of existing Stake Units for this Validator.

- Name: total_stake_unit_supply
- Type: Method
- Callable By: Public
- Arguments: None
- Returns: Decimal: The amount of existing Stake Units

get_redemption_value

Estimates the redemption value (in XRD) of the given amount of Stake Units.

- Name: get_redemption_value
- Type: Method
- Callable By: Public
- Arguments: amount_of_stake_units - Decimal: The amount of Stake Units to estimate the redemption value of
- Returns: Decimal: The estimated XRD redemption value

register

Registers the Validator to be available to validate and propose transactions in Consensus. Only callable by the Owner. Note that the Validator will not immediately be entered into the Active Validator Set. The Validator will be added at the start of the next Epoch, provided it has enough stake to be in the top 100.

- Name: register
- Type: Method
- Callable By: Owner
- Arguments: None
- Returns: Nothing

unregister

Unregisters the Validator from validating and proposing transactions in Consensus. Only callable by the Owner. Note that the Validator must finish its responsibility of validating for the current Epoch and will only be removed from the Active Validator Set at the start of the next Epoch.

- Name: unregister
- Type: Method
- Callable By: Owner
- Arguments: None
- Returns: Nothing

update_key

Updates the public key of the Validator. Only callable by the Owner. Note that the Validator's key will only be updated and used by Consensus on the next Epoch.
- Name: update_key
- Type: Method
- Callable By: Owner
- Arguments: key - Secp256k1PublicKey: The public key to replace the Validator's Consensus public key with
- Returns: Nothing

update_fee

Updates the fee percentage which a Validator skims off the Network Emissions (and puts into the Owner's Stake Vault). Notes: - The change is not applied immediately: - If the newly requested fee is higher than the current one, the change becomes effective after an approximately 2-week wait. - If it is lower, then it becomes effective from the beginning of the next epoch. - Requesting a consecutive change discards the previous request (if a not-yet-effective one was pending).

- Name: update_fee
- Type: Method
- Callable By: Owner
- Arguments: fee_factor - Decimal: A decimal >= 0.0 and <= 1.0 representing the new fee fraction
- Returns: Nothing

lock_owner_stake_units

Locks the given Stake Units in an internal "delayed withdrawal" vault (as a way of showing the Owner's commitment to running the Validator). Only callable by the Owner.

- Name: lock_owner_stake_units
- Type: Method
- Callable By: Owner
- Arguments: stake_unit_bucket - Bucket: A bucket of Stake Units
- Returns: Nothing

start_unlock_owner_stake_units

Begins the process of unlocking the Owner's Stake Units. The requested amount of Stake Units (if available) will be ready for withdrawal after the Network-configured number of Epochs is reached. Only callable by the Owner.

- Name: start_unlock_owner_stake_units
- Type: Method
- Callable By: Owner
- Arguments: requested_stake_unit_amount - Decimal: The amount of Stake Units to start unlocking
- Returns: Nothing

finish_unlock_owner_stake_units

Finishes the process of unlocking the Owner's Stake Units by withdrawing all the pending amounts which have reached their target Epoch and thus are already available - potentially none. Only callable by the Owner.
- Name: finish_unlock_owner_stake_units
- Type: Method
- Callable By: Owner
- Arguments: None
- Returns: Bucket: A bucket of Stake Units

update_accept_delegated_stake

Updates the flag deciding whether the Validator should accept delegated stake. Only callable by the Owner.

- Name: update_accept_delegated_stake
- Type: Method
- Callable By: Owner
- Arguments: accept_delegated_stake - bool: Whether to accept delegated stake
- Returns: Nothing

signal_protocol_update_readiness

Signals on ledger what protocol version to potentially change to. Used by Consensus to coordinate protocol updates. Only callable by the Owner.

- Name: signal_protocol_update_readiness
- Type: Method
- Callable By: Owner
- Arguments: protocol_version_name - String: The protocol version to signal readiness for
- Returns: Nothing

Events

The Validator is the source of the following events:

```rust
RegisterValidatorEvent {} // body intentionally empty
```

Emitted when the source Validator is registered at the Consensus Manager by the Owner (i.e. on a register API call).

```rust
UnregisterValidatorEvent {} // body intentionally empty
```

Emitted when the source Validator is unregistered from the Consensus Manager by the Owner (i.e. on an unregister API call).

```rust
StakeEvent {
    // The amount of XRD received from the User.
    xrd_staked: Decimal
}
```

Emitted when a User (potentially the Owner) delegates new stake to the source Validator via a stake/stake_as_owner API call. Note: this deliberately does not include the auto-staked Validator Fee during Network Emission handling.

```rust
UnstakeEvent {
    // The amount of Stake Units received from the User and burnt.
    stake_units: Decimal
}
```

Emitted when a User (potentially the Owner) starts the unstaking process from the source Validator (i.e. receives the Unstake NFT) via an unstake API call.

```rust
ClaimXrdEvent {
    // The amount of XRD returned to the User.
    claimed_xrd: Decimal
}
```

Emitted when a User (potentially the Owner) finishes the unstaking process from the source Validator (i.e. claims the XRD) via a claim_xrd API call.
```rust
UpdateAcceptingStakeDelegationStateEvent {
    // The new value of the flag.
    accepts_delegation: bool
}
```

Emitted on the update_accept_delegated_stake API call.

```rust
ProtocolUpdateReadinessSignalEvent {
    // The name of the protocol version that the Node is ready for.
    protocol_version_name: String
}
```

Emitted on the signal_protocol_update_readiness API call.

```rust
ValidatorEmissionAppliedEvent {
    // The *concluded* epoch for which this Emission applies.
    epoch: Epoch,
    // The XRD amount in the Validator's Stake Vault captured before this Emission.
    starting_stake_pool_xrd: Decimal,
    // The XRD amount added to the Validator's Stake Vault from this epoch's Emission.
    // Note: This number represents the net amount, after any applicable reliability penalty
    // and validator fee have been subtracted.
    stake_pool_added_xrd: Decimal,
    // The total supply of the Validator's Stake Units at the moment of applying this Emission
    // (i.e. *before* the auto-staking of the Validator's Fee described below).
    //
    // Derivable values:
    // - calculating `stake_pool_added_xrd / total_stake_unit_supply` gives a convenient
    //   "XRD emitted per stake unit" factor, which may be used to easily calculate
    //   individual staker's gains.
    total_stake_unit_supply: Decimal,
    // The XRD amount received by the Validator's Owner (according to the configured
    // Validator Fee percentage).
    // Note: This fee is automatically staked and locked in the Owner Locked Stake Vault.
    //
    // Derivable values:
    // - calculating `stake_pool_added_xrd + validator_fee_xrd` gives the total emission for
    //   this Validator (entirety of which goes into its Stake Vault).
    // - calculating `validator_fee_xrd / (stake_pool_added_xrd + validator_fee_xrd)` gives the
    //   Validator's configured Fee percentage effective during the Emission's period.
    validator_fee_xrd: Decimal,
    // The number of proposals successfully made by this Validator during the Emission's period.
    proposals_made: u64,
    // The number of proposals missed by this Validator during the Emission's period.
    proposals_missed: u64,
}
```

Emitted at the epoch change, after the Consensus Manager distributes the appropriate share of Network Emissions to the source Validator.

```rust
ValidatorRewardAppliedEvent {
    // The *concluded* epoch for which this Reward applies.
    epoch: Epoch,
    // The XRD amount rewarded (which got auto-staked and locked in the Owner's Locked Stake Vault).
    amount: Decimal
}
```

Emitted at the epoch change, after the Consensus Manager distributes the appropriate share of Rewards (i.e. Transaction Fees and Tips) to the source Validator.

## Pool

URL: https://radix.wiki/developers/legacy-docs/reference/radix-engine/native-blueprints/pool Updated: 2026-02-18 Summary: Liquidity pools are a concept used pervasively in a very wide range of DeFi applications. Reference > Radix Engine > Native Blueprints > Pool — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/pool.md)

Resource Pools

Liquidity pools are a concept used pervasively in a very wide range of DeFi applications. Users that participate in contributing to liquidity pools receive a token that represents their proportional contribution. These tokens are often called "LP tokens"; we call them more generically "pool units". The contents of the pool may shift over time (depending on the application) and ultimately the pool units are redeemable for the user's proportion of the pool. This makes pool units an important type of asset, where the user would like a clear indication in their wallet of exactly what those pool units are worth at any given time, and to be confident that there is no question of their ability to redeem them. On other networks this is virtually impossible to do with any guarantees because each pool is implemented with arbitrary logic. To show users what pool units are worth consistently and without risk, and to ensure redeemability, pools and pool units must have guaranteed predictable behavior.
Fortunately, the fundamental concept of the pool and pool unit is quite universal, and so we have created a native pool package that allows any developer to instantiate pools for their application without constraining application functionality. These native pool components and the pool units they issue allow the wallet to provide the information and guaranteed behavior that users desire, similar to other native components like accounts and validators. The pool package has three blueprints: a one-resource pool blueprint, a two-resource pool blueprint, and a more general multi-resource pool blueprint. This page documents the two-resource pool blueprint. However, the information provided here is still relevant to the other blueprints, with small differences in data types.

Goals

While pool-based application functionality varies enormously, the pool concept itself is quite simple and has a set of consistent properties: - The pool has one or more predefined token types that it holds. - Users can contribute tokens to the pool. - If there is more than one token type in the pool, the ratio of token types contributed must match the ratio of the token types in the pool. - When users contribute, they always receive back a quantity of newly-minted pool unit tokens. For a contribution of tokens equal to X% of the pool, the user receives a quantity of pool units equal to X% of the total supply of those pool units at that moment. - Users can redeem pool units for tokens from the pool. - If there is more than one token type in the pool, the ratio of token types returned in the redemption matches the ratio of the token types in the pool. - When users redeem, they send a quantity of pool units to the pool. For a quantity of pool units equal to X% of the total supply of pool units at that moment, the pool returns tokens equal to X% of the pool. The redeemed pool units are burned.
- Special entities outside of the pool have the rights to directly deposit tokens to or withdraw tokens from the pool according to application-specific logic. With the above universal behavior, all of the variation of application usage of pools can be served with just three elements of pool configuration: - What token type(s) is the pool configured to accept? - What is the metadata configuration of the pool unit token, to "brand” it for users of the application? - Who/what has the rights to directly deposit and withdraw tokens from the pool according to the business logic of the application? For example, a DEX: - The DEX system instantiates a pool with two token types for the two sides of a trading pair, XRD and ZOMBO. - It sets the pool unit metadata to have the name "CoolDEX: XRD/ZOMBO”, and a specified icon URL. - It sets the authorities for the protected_deposit/protected_withdraw methods to a badge held by the DEX’s component logic. That component logic would then use those methods to conduct XRD/ZOMBO trades out of the pool according to its preferred bonding curve, as well as perform any distribution of fees to the pool. The native Pool component doesn’t presuppose what the pool means or who controls it via the protected_deposit/protected_withdraw methods; it only provides the basic universal pool functions: Contributions and withdrawals are of the correct type and adhere to the current pool ratio, and proportional pool unit minting/burning is done correctly to always represent the right share of the pool. 
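The X%-in, X%-out rule described above can be written down directly. The following is a hypothetical sketch, not the native blueprint's implementation: `PoolSketch` and its fields are invented names, `f64` stands in for `Decimal`, the pool is assumed already initialized, and divisibility rounding is ignored.

```rust
// Hypothetical sketch of the proportional pool-unit rule for a two-resource
// pool. f64 stands in for Decimal; assumes a non-empty, already-running pool.

struct PoolSketch {
    reserves: (f64, f64),  // current contents of the two vaults
    pool_unit_supply: f64, // total pool units in circulation
}

impl PoolSketch {
    /// Contribute at the current pool ratio: tokens equal to X% of the pool
    /// mint pool units equal to X% of the current supply.
    fn contribute(&mut self, a: f64, b: f64) -> f64 {
        let share = a / self.reserves.0;
        assert!(
            (share - b / self.reserves.1).abs() < 1e-12,
            "contribution must match the pool ratio"
        );
        let minted = share * self.pool_unit_supply;
        self.reserves.0 += a;
        self.reserves.1 += b;
        self.pool_unit_supply += minted;
        minted
    }

    /// Redeem pool units for the same fraction of both reserves;
    /// the redeemed units are burned.
    fn redeem(&mut self, units: f64) -> (f64, f64) {
        let share = units / self.pool_unit_supply;
        let out = (self.reserves.0 * share, self.reserves.1 * share);
        self.reserves.0 -= out.0;
        self.reserves.1 -= out.1;
        self.pool_unit_supply -= units;
        out
    }
}
```

Because both operations use the same fraction for reserves and supply, every pool unit always represents the same share of the pool before and after any contribution or redemption.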
This in turn means that, with a native pool component, a wallet or dashboard UI for pool units knows some important things with certainty: - This is in fact a pool unit that was minted by a pool (not something that behaves oddly) - A quantity of pool units is in fact redeemable at this moment for a known quantity of tokens in the pool - No application logic may stop the holder of the pool units from redeeming them at the pool Auth Roles All three of the pool blueprints come with two auth roles whose definition is configurable by the instantiator of the blueprint. These two roles have the following responsibilities: - owner: The AccessRule associated with the owner role can be configured by the instantiator of the pool. The instantiate function on the pool will set this as the owner of both the pool unit resource and the pool component. The owner is given the ability to update the metadata of the pool component and pool unit resource. - pool_manager_role: This role is given the ability to call the protected_withdraw, protected_deposit, and contribute methods on the pool components to manage and utilize the funds in the pool. Based on the above, the following is an example configuration of the owner and pool_manager_role roles that developers who use the pool blueprints may wish to adopt. Say you’re developing a radiswap style blueprint of a Constant Function Market Maker (CFMM) which makes use of the two-resource pool under the hood for elegant management of pool units and pool ownership proportions. The owner role could be configured to be a badge that is stored in the account of the owner of the protocol such that they can update metadata on their pool components and pool unit resources freely after the instantiation of their components. The pool_manager_role role could be configured to be a badge owned by the Radiswap component (or a virtual component caller badge) to allow the Radiswap component to manage the funds of the pool. 
Pool Unit

Contributing to a pool provides liquidity providers with pool units that represent their proportion of ownership in the pool and can be redeemed for said proportion of the pool. Pool units have the following access rules configuration:

| Action | Role | Role Updater |
| --- | --- | --- |
| Mint | Pool | DenyAll |
| Burn | Pool | DenyAll |
| Withdraw | AllowAll | DenyAll |
| Deposit | AllowAll | DenyAll |
| Recall | DenyAll | DenyAll |
| Update Metadata | Owner Role | DenyAll |

API Reference

This section documents the interface of the two-resource pool blueprint. The information provided here is also relevant for the one-resource pool and multi-resource pool blueprints but some of the arguments and return types might be different. However, the core concepts still apply. Additional details for the various pool blueprints can be found in the Rust docs: - OneResourcePool (https://docs.rs/scrypto/latest/scrypto/component/struct.OneResourcePool.html) - TwoResourcePool (https://docs.rs/scrypto/latest/scrypto/component/struct.TwoResourcePool.html) - MultiResourcePool (https://docs.rs/scrypto/latest/scrypto/component/struct.MultiResourcePool.html)

instantiate

- Name: instantiate
- Type: Function
- Description: This function instantiates a new two-resource pool of the two resources provided in the resource_addresses argument of the function. The owner_role and pool_manager_rule provided as arguments to this function are set as the rule definitions of the owner and pool manager roles respectively. There are certain cases where this function panics and the creation of the pool fails. These cases are as follows: - If the resource addresses in the resource_addresses are not different (i.e., a pool is being created between the resource and itself). - If one of the resource addresses in the resource_addresses tuple is of a non-fungible resource.
- Callable by: Public
- Arguments: - owner_role - OwnerRole: The configuration (AccessRule and mutability) of the owner role to use for the pool component and the pool unit resource.
Information on the powers given to this role can be found in the Auth Roles (pool-component#auth-roles) section of this document. - pool_manager_rule - AccessRule: The access rule to associate with the pool_manager_role. Information on the powers given to this role can be found in the Auth Roles (pool-component#auth-roles) section of this document. - resource_addresses - (ResourceAddress, ResourceAddress): A two-element tuple where each element is a ResourceAddress of the resources that this pool will be made out of. - address_reservation - Option: An optional reservation for the global address of the component being instantiated. If provided, this reservation ensures that the component will be assigned the reserved address upon globalizing. If None is passed, the system will automatically allocate an address. - Returns: Global: A global TwoResourcePool object of the newly instantiated pool component is returned.

contribute

- Name: contribute
- Type: Method
- Description: A method, callable only by the pool_manager_role, that allows resources to be contributed to the pool in exchange for pool unit tokens minted by the pool. When this method is called, there are four states that the pool could be in, which change the behavior of the pool slightly.

State 1 - Reserves: Both Empty, Pool Unit Supply: Zero
Behavior: In this case, the pool is considered to be new. The entire contribution provided in the buckets argument of the method is accepted and no change is returned. The amount of pool units minted for the caller is equal to the geometric mean of the contribution provided.

State 2 - Reserves: Any Empty, Pool Unit Supply: Non-Zero
Behavior: In this case, the pool does not accept any contributions since the pool is considered to be in an illegal state. Despite there being no reserves in the pool, there are some pool units in circulation, meaning that somebody owns some percentage of zero. Contributing to a pool that is in this state leads to a panic.
State 3 - Reserves: Any Not Empty, Pool Unit Supply: Zero
Behavior: The pool is considered to be new. The entire contribution provided in the buckets argument of the method is accepted and no change is returned. The amount of pool units minted for the caller is equal to the geometric mean of the contribution provided. The first contributor also gets any dust remaining in the reserves.

State 4 - Reserves: Both Not Empty, Pool Unit Supply: Non-Zero
Behavior: The pool is operating normally. An appropriate amount of the provided resources is contributed to the pool and the remaining resources are returned as change. The amount of pool units minted for the caller is proportional to the amount of resources the pool has accepted as contribution.

Depending on its current state, the pool will either accept the contribution in full, accept it in part, or reject it completely, and the amount of pool units minted changes accordingly. There are certain cases where this method panics and the contribution fails:

- If the resources provided in the buckets argument do not belong to the pool, making the contribution invalid.
- If any of the provided buckets are empty.
- If there are no reserves but the total supply of pool units is not zero (State 2 above).

Callable by: pool_manager_role

Arguments:

- buckets - (Bucket, Bucket): A two-element tuple where each element is a Bucket of the resources to contribute to the pool.

Returns:

- Bucket: A bucket of the pool units minted for the contribution made to the pool.
- Option<Bucket>: An optional return of the change remaining from the contribution to the pool.

Note: This method takes into account the case where one or both of the resources in the pool have a divisibility that is not 18.
In this case, the amount that the pool accepts of a resource with non-18 divisibility is always rounded down to the nearest decimal place allowed by the resource's divisibility, and the amount of pool units minted takes this into account.

redeem

Name: redeem
Type: Method

Description: Given a Bucket of pool units, this method redeems the pool units for the proportion of the pool that they own. This method is callable by anybody who has pool units and cannot be protected. There is one case where this method panics and the redemption of pool units fails:

- If the resource in the provided bucket is not the pool unit resource expected by the pool component.

Callable by: Public

Arguments:

- bucket - Bucket: A bucket of the pool units to redeem for some proportion of the pool.

Returns:

- (Bucket, Bucket): A two-element tuple where each element is a Bucket of the proportion of the resources in the pool owed for the pool units.

Note: This method takes into account the case where one or both of the resources in the pool have a divisibility that is not 18. In this case, the amount of resources given back to the caller is always rounded down to fit the divisibility of the resource, so a pool that gets completely drained may have some dust remaining in one or more of its vaults.

protected_deposit

Name: protected_deposit
Type: Method

Description: Given a Bucket of tokens, this method deposits the bucket into the appropriate vault in the pool. This method is only callable by the pool_manager_role since it is considered a method used for the management of the funds in the pool. There is one case where this method panics and the deposit fails:

- If the resources in the provided bucket do not belong to the pool.

Callable by: pool_manager_role

Arguments:

- bucket - Bucket: A bucket of the resources to deposit into the pool.
Returns: Nothing

protected_withdraw

Name: protected_withdraw
Type: Method

Description: Given a ResourceAddress and a Decimal amount, this method withdraws that amount from the pool. This method is only callable by the pool_manager_role since it is considered a method used for the management of the funds in the pool. There is one case where this method panics and the withdrawal fails:

- If the provided resource address does not belong to the pool.

Callable by: pool_manager_role

Arguments:

- resource_address - ResourceAddress: The address of the resource to withdraw from the pool.
- amount - Decimal: The amount to withdraw from the pool.
- withdraw_strategy - WithdrawStrategy: Controls how the withdrawal is handled in relation to the divisibility of the resource. If WithdrawStrategy::Exact is used, it is the caller's responsibility to ensure that the provided amount is compatible with the divisibility of the resource. If WithdrawStrategy::Rounded is specified, it is the pool's responsibility to round the given amount so that it is compatible with the divisibility of the resource. It is recommended to always set this to WithdrawStrategy::Rounded(RoundingMode::ToZero) when calling this method so that your blueprint never runs into any panics.

Returns:

- Bucket: A bucket of the withdrawn resources.

get_redemption_value

Name: get_redemption_value
Type: Method

Description: Calculates the amount of pool resources that some amount of pool units can be redeemed for.

Callable by: Public

Arguments:

- amount_of_pool_units - Decimal: The amount of pool units to calculate the corresponding amount of pool resources for.

Returns:

- BTreeMap<ResourceAddress, Decimal>: A map of the resources that the pool units can be redeemed for, mapping the address of each resource to its amount.

get_vault_amounts

Name: get_vault_amounts
Type: Method

Description: Returns the amount of reserves in the pool.
Callable by: Public

Arguments: None

Returns:

- BTreeMap<ResourceAddress, Decimal>: A map of the reserves in the pool, mapping the address of each resource to its amount.

Events

ContributionEvent

An event emitted when resources are contributed to the pool through the contribute method. Fields:

- contributed_resources - BTreeMap<ResourceAddress, Decimal>: A map of the resources that the pool has accepted as a contribution.
- pool_units_minted - Decimal: The amount of pool units that have been minted as a result of this contribution to the pool.

RedemptionEvent

An event emitted whenever pool units are redeemed from the pool through the redeem method. Fields:

- pool_unit_tokens_redeemed - Decimal: The amount of pool units that have been redeemed.
- redeemed_resources - BTreeMap<ResourceAddress, Decimal>: The resources that have been redeemed.

WithdrawEvent

An event emitted whenever resources are withdrawn from the pool through the protected_withdraw method. Fields:

- resource_address - ResourceAddress: The address of the resource that has been withdrawn.
- amount - Decimal: The amount of the resource that has been withdrawn.

DepositEvent

An event emitted whenever resources are deposited into the pool through the protected_deposit method. Fields:

- resource_address - ResourceAddress: The address of the resource that has been deposited.
- amount - Decimal: The amount of the resource that has been deposited.

Metadata

Pool Component

| Key | Value Type | Description |
| --- | --- | --- |
| pool_vault_number | u8 | The number of vaults that the pool component has. |
| pool_resources | Vec<ResourceAddress> | The addresses of the resources in the pool. |
| pool_unit | GlobalAddress | The address of the pool unit resource associated with this pool. |

Pool Unit Resource

| Key | Value Type | Description |
| --- | --- | --- |
| pool | GlobalAddress | The address of the pool component that this pool unit resource is associated with. |
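The contribution and redemption arithmetic described above can be sketched in plain Rust. This is a simplified illustration only — it uses f64 amounts and hypothetical function names rather than the engine's Decimal-based implementation — covering geometric-mean minting for a fresh pool, proportional minting for an operating pool, and redemption rounded down to each resource's divisibility:

```rust
// Simplified sketch (assumed names, f64 instead of Decimal) of the pool-unit
// arithmetic described in the contribute and redeem sections above.

/// Round an amount down to `divisibility` decimal places
/// (the behavior of WithdrawStrategy::Rounded(RoundingMode::ToZero)).
fn round_down(amount: f64, divisibility: u32) -> f64 {
    let scale = 10f64.powi(divisibility as i32);
    (amount * scale).floor() / scale
}

/// Pool units minted for a contribution to a fresh pool (States 1 and 3):
/// the geometric mean of the two contributed amounts.
fn initial_pool_units(a: f64, b: f64) -> f64 {
    (a * b).sqrt()
}

/// Contribution to a normally operating pool (State 4): the pool accepts
/// amounts in proportion to its reserves; the rest is returned as change.
/// Returns (pool units minted, accepted amount of a, accepted amount of b).
fn contribute(reserves: (f64, f64), supply: f64, bucket: (f64, f64)) -> (f64, f64, f64) {
    let ratio = (bucket.0 / reserves.0).min(bucket.1 / reserves.1);
    (ratio * supply, ratio * reserves.0, ratio * reserves.1)
}

/// Redemption: each reserve is paid out pro rata, rounded down to fit the
/// resource's divisibility (so a fully drained pool may keep some dust).
fn redeem(reserves: (f64, f64), supply: f64, units: f64, div: (u32, u32)) -> (f64, f64) {
    let share = units / supply;
    (
        round_down(share * reserves.0, div.0),
        round_down(share * reserves.1, div.1),
    )
}

fn main() {
    // Fresh pool: contributing (100, 400) mints sqrt(100 * 400) = 200 units.
    assert_eq!(initial_pool_units(100.0, 400.0), 200.0);

    // Operating pool with reserves (100, 400) and 200 units outstanding:
    // a (10, 80) contribution only accepts (10, 40); the extra 40 is change.
    assert_eq!(contribute((100.0, 400.0), 200.0, (10.0, 80.0)), (20.0, 10.0, 40.0));

    // Redeeming 10 of 200 units against reserves (100, 400).
    assert_eq!(redeem((100.0, 400.0), 200.0, 10.0, (2, 0)), (5.0, 20.0));
}
```

Note how in the State 4 case the mint amount is driven by the smaller of the two contribution ratios, which is why the method can return change.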
Example

A CFMM pool can be built on top of a two-resource pool, requiring the CFMM's blueprint to only implement the functionality of a CFMM while letting the two-resource pool component handle the proportion of ownership of the pool and provide the liquidity providers with pool unit tokens, which are recognized by the Babylon wallet and can be redeemed at any time directly through the wallet. The following example is of a Radiswap blueprint, a CFMM pool blueprint that utilizes a two-resource pool.

```rust
use scrypto::prelude::*;

#[blueprint]
#[events(InstantiationEvent, AddLiquidityEvent, RemoveLiquidityEvent, SwapEvent)]
mod radiswap {
    struct Radiswap {
        pool_component: Global<TwoResourcePool>,
    }

    impl Radiswap {
        pub fn new(
            owner_role: OwnerRole,
            resource_address1: ResourceAddress,
            resource_address2: ResourceAddress,
            dapp_definition_address: ComponentAddress,
        ) -> Global<Radiswap> {
            let (address_reservation, component_address) =
                Runtime::allocate_component_address(Radiswap::blueprint_id());
            let global_component_caller_badge =
                NonFungibleGlobalId::global_caller_badge(component_address);

            // Creating a new pool will check the following for us:
            // 1. That both resources are not the same.
            // 2. That none of the resources are non-fungible.
            let pool_component = Blueprint::<TwoResourcePool>::instantiate(
                owner_role.clone(),
                rule!(require(global_component_caller_badge)),
                (resource_address1, resource_address2),
                None,
            );

            let component = Self { pool_component }
                .instantiate()
                .prepare_to_globalize(owner_role.clone())
                .with_address(address_reservation)
                .metadata(metadata!(
                    init {
                        "name" => "Radiswap", updatable;
                        "dapp_definition" => dapp_definition_address, updatable;
                    }
                ))
                .globalize();

            Runtime::emit_event(InstantiationEvent {
                component_address: component.address(),
                resource_address1,
                resource_address2,
                owner_role,
            });

            component
        }

        pub fn add_liquidity(
            &mut self,
            resource1: Bucket,
            resource2: Bucket,
        ) -> (Bucket, Option<Bucket>) {
            Runtime::emit_event(AddLiquidityEvent([
                (resource1.resource_address(), resource1.amount()),
                (resource2.resource_address(), resource2.amount()),
            ]));

            // All the checks for correctness of buckets and everything else is
            // handled by the pool component! Just pass it the resources and it
            // will either return the pool units back if it succeeds or abort
            // on failure.
            self.pool_component.contribute((resource1, resource2))
        }

        /// This method does not need to be here - the pool units are redeemable
        /// without it by the holders of the pool units directly from the pool.
        /// In this case this is just a nice proxy so that users are only
        /// interacting with one component and do not need to know about the
        /// address of Radiswap and the address of the Radiswap pool.
        pub fn remove_liquidity(&mut self, pool_units: Bucket) -> (Bucket, Bucket) {
            let pool_units_amount = pool_units.amount();
            let (bucket1, bucket2) = self.pool_component.redeem(pool_units);
            Runtime::emit_event(RemoveLiquidityEvent {
                pool_units_amount,
                redeemed_resources: [
                    (bucket1.resource_address(), bucket1.amount()),
                    (bucket2.resource_address(), bucket2.amount()),
                ],
            });
            (bucket1, bucket2)
        }

        pub fn swap(&mut self, input_bucket: Bucket) -> Bucket {
            let mut reserves = self.vault_reserves();
            let input_amount = input_bucket.amount();
            let input_reserves = reserves
                .remove(&input_bucket.resource_address())
                .expect("Resource does not belong to the pool");
            let (output_resource_address, output_reserves) =
                reserves.into_iter().next().unwrap();
            let output_amount = input_amount
                .checked_mul(output_reserves)
                .unwrap()
                .checked_div(input_reserves.checked_add(input_amount).unwrap())
                .unwrap();

            Runtime::emit_event(SwapEvent {
                input: (input_bucket.resource_address(), input_bucket.amount()),
                output: (output_resource_address, output_amount),
            });

            // NOTE: It's the responsibility of the user of the pool to do the
            // appropriate rounding before calling the withdraw method.
            self.deposit(input_bucket);
            self.withdraw(output_resource_address, output_amount)
        }

        fn vault_reserves(&self) -> IndexMap<ResourceAddress, Decimal> {
            self.pool_component.get_vault_amounts()
        }

        fn deposit(&mut self, bucket: Bucket) {
            self.pool_component.protected_deposit(bucket)
        }

        fn withdraw(&mut self, resource_address: ResourceAddress, amount: Decimal) -> Bucket {
            self.pool_component.protected_withdraw(
                resource_address,
                amount,
                WithdrawStrategy::Rounded(RoundingMode::ToZero),
            )
        }
    }
}

#[derive(ScryptoSbor, ScryptoEvent)]
pub struct InstantiationEvent {
    pub owner_role: OwnerRole,
    pub resource_address1: ResourceAddress,
    pub resource_address2: ResourceAddress,
    pub component_address: ComponentAddress,
}

#[derive(ScryptoSbor, ScryptoEvent)]
pub struct AddLiquidityEvent([(ResourceAddress, Decimal); 2]);

#[derive(ScryptoSbor, ScryptoEvent)]
pub struct RemoveLiquidityEvent {
    pub pool_units_amount: Decimal,
    pub redeemed_resources: [(ResourceAddress, Decimal); 2],
}

#[derive(ScryptoSbor, ScryptoEvent)]
pub struct SwapEvent {
    pub input: (ResourceAddress, Decimal),
    pub output: (ResourceAddress, Decimal),
}
```

- All three of the pool blueprints come with stubs defined in Scrypto which provide type safety and allow for a Rust-like way of invoking methods on the pool components. The stubs to use are OneResourcePool, TwoResourcePool, and MultiResourcePool.
- There are two cases where this function can panic: a) if both resources are the same, b) if any of the resources are non-fungible.
- The pool does all of the necessary checks to ensure that the correct resources were provided and contains all of the logic for determining how many pool units to mint in return and whether there is any change to return back to the caller.
- This method does not need to be here - the pool units are redeemable without it by the holders of the pool units directly from the pool.
In this case this is just a nice proxy so that users are only interacting with one component and do not need to know about the address of Radiswap and the address of the Radiswap pool.

In the above example, the only method that the Radiswap blueprint needed to implement was the swap method, which defines how the resources in the pool can be used by the manager of the pool to conduct an exchange or swap of resources. Most of the other methods on the Radiswap blueprint are pass-through methods implemented purely for a nicer interface.

## Access Controller

URL: https://radix.wiki/developers/legacy-docs/reference/radix-engine/native-blueprints/access-controller Updated: 2026-02-18

The crypto wallet experience today is unacceptable for all but those who have the crypto enthusiast's combination of technical savvy and risk tolerance. Particularly for DeFi wallets like Metamask, holding assets there and interacting with dApps is fraught with the risk of losing access to accounts or having funds stolen if the user lacks a high degree of op-sec sophistication. The source of the problem is that full control of an account, and the tokens it holds, is provided by a single private key. Maintaining control of this private key requires too much sophistication for the average user, and multi-signature mechanisms that might help the problem must be implemented as separate smart contracts that are complex and difficult to use. The access controller blueprint is a native blueprint designed to solve the above-mentioned problems.
An access controller holds badges and defines role-based logic around the roles that can create proofs out of the badge and the roles needed to perform multi-factor recovery. More concretely, an access controller does the following:

- Holds one or more badges of the same resource address in its vault.
- Permits specific roles to create proofs out of the badge(s) such that they can have the privileges afforded to this badge.
- Defines the multi-factor recovery logic that allows for the role definitions to be changed and swapped with new definitions.

The access controller blueprint defines three roles: Primary (P), Recovery (R), and Confirmation (C). Each of these roles plays a different part in the access controller. As an example, the Primary (P) role is the only role that can create proofs out of the badge stored in the access controller, the Primary (P) and Recovery (R) roles are the only roles that can initiate recovery or badge withdrawal, and so on. If one or more of the roles on the access controller is compromised, the recovery procedure can be used to change the role definitions to new roles. The recovery procedure requires either:

- The Primary (P) or Recovery (R) role initiates recovery, and then any of the three roles can confirm the recovery, provided that the role that confirms the recovery is not the same as the role that initiated it.
- If an access controller component allows for timed recovery, then recoveries initiated by the Recovery (R) role can be confirmed by anybody after the timed recovery delay has passed. This delay can be enabled or disabled and configured when instantiating a new access controller component and when performing recovery.

Roles

The access controller blueprint defines three roles: Primary (P), Recovery (R), and Confirmation (C), and each one of those roles has unique powers and responsibilities.
The following is a table of all of the access controller operations and the roles that can perform them:

| Action | Required Role |
| --- | --- |
| Creation of a proof of the held badge | Primary (P) role |
| Initiation of the recovery process | Primary (P) role OR Recovery (R) role |
| Confirming an existing recovery proposal | Any role that isn't the proposer |
| Confirming an existing timed recovery | Anybody |
| Canceling an existing recovery proposal | The proposer |
| Initiation of the badge withdrawal process | Primary (P) role OR Recovery (R) role |
| Confirming an existing badge withdrawal attempt | Any role that isn't the proposer |
| Canceling an existing badge withdrawal attempt | The proposer |
| Locking and unlocking of the Primary role | Recovery (R) role |
| Stopping timed recovery | Primary (P) role OR Recovery (R) role OR Confirmation (C) role |

From this point onward, this document will refer to the Primary (P) and Recovery (R) roles as the Proposer Roles, as they are the only roles allowed to propose recovery or badge withdrawal. The Primary (P), Recovery (R), and Confirmation (C) roles will be referred to as the Roles.

The Roles are all defined through Scrypto's powerful AccessRules system. This means that any role can be arbitrarily complex: it does not need to be a single fungible or non-fungible badge or a single signature. Any AccessRule (within certain limits) can be used for any of the three Roles.

The primary role is given the privilege to create proofs from the protected badge that the access controller holds. When the access controller's create_proof method is called from the manifest, it returns a Proof which is put in the auth zone. The recovery role can lock the primary role, taking away its privilege to create proofs of the protected badge but no other privileges. Similarly, the recovery role can unlock the primary role, giving it back the privilege to create proofs from the protected badge.
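The primary-role locking rule above can be sketched as a minimal state model. This is a hypothetical simplification with invented names: the real blueprint enforces these rules through the Radix Engine's auth system and AccessRules, not through explicit Rust checks like these.

```rust
// Hypothetical sketch of the locking rule: the Recovery role can lock and
// unlock the Primary role, and only an unlocked Primary role may create
// proofs of the protected badge. Names are assumptions for illustration.

#[derive(Clone, Copy, PartialEq, Debug)]
enum Role { Primary, Recovery, Confirmation }

struct AccessControllerSketch {
    primary_locked: bool,
}

impl AccessControllerSketch {
    fn new() -> Self {
        Self { primary_locked: false }
    }

    /// Only the Primary role may create a proof, and only while unlocked.
    fn create_proof(&self, caller: Role) -> Result<&'static str, &'static str> {
        match caller {
            Role::Primary if !self.primary_locked => Ok("Proof"),
            Role::Primary => Err("primary role is locked"),
            _ => Err("only the primary role can create proofs"),
        }
    }

    /// Only the Recovery role may lock or unlock the Primary role.
    fn set_primary_lock(&mut self, caller: Role, locked: bool) -> Result<(), &'static str> {
        if caller == Role::Recovery {
            self.primary_locked = locked;
            Ok(())
        } else {
            Err("only the recovery role can (un)lock the primary role")
        }
    }
}

fn main() {
    let mut ac = AccessControllerSketch::new();
    assert!(ac.create_proof(Role::Primary).is_ok());
    assert!(ac.create_proof(Role::Confirmation).is_err());

    // Recovery locks the primary role; proof creation is now denied...
    ac.set_primary_lock(Role::Recovery, true).unwrap();
    assert!(ac.create_proof(Role::Primary).is_err());

    // ...and the primary role cannot unlock itself.
    assert!(ac.set_primary_lock(Role::Primary, false).is_err());

    // Unlocking by the recovery role restores the privilege.
    ac.set_primary_lock(Role::Recovery, false).unwrap();
    assert!(ac.create_proof(Role::Primary).is_ok());
}
```

Note that locking removes only the proof-creation privilege; in the real blueprint the primary role keeps its other powers (such as initiating recovery) while locked.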
Recovery

Recovery is the process where some of the existing Roles propose and agree on new Roles to use for the access controller. Recovery is proposed by one of the Proposer Roles. Then, it may either be confirmed by one of the Roles that is not the proposer, or after the timed recovery delay passes (if one is defined for the access controller component). After a recovery proposal is submitted and confirmed, the old Roles lose their powers and the new Roles take their place. All methods involved in access controller recovery require that the recovery proposal be passed in again; this is done so that parties are explicit about which recovery proposal they are confirming.

The following is a more complete description of the recovery process and its two steps:

- The initiation of recovery: one of the Proposer Roles submits a recovery proposal to the access controller component consisting of the AccessRules they propose for each of the three Roles and of the new timed recovery delay they are proposing. Only the Primary (P) and Recovery (R) roles can initiate recovery.
- The confirmation of recovery: after a recovery proposal is submitted by a Proposer Role it is not enacted immediately; it needs to be confirmed. Recovery proposals may be confirmed in one of two ways:
- Quick recovery confirmation: this requires 2-of-3 of the roles to agree on a proposal before the access controller enacts it and changes the role definitions and the timed recovery delay. As an example, if the primary role initiated the recovery process, then the recovery or confirmation role could confirm the proposal. Likewise, if the recovery role initiates the recovery process, the primary and confirmation roles can confirm its proposal. The confirmation role cannot initiate the recovery process.
- Timed recovery confirmation: this allows recovery proposals submitted by the recovery role to be confirmed by anybody after some delay has passed. Once the delay has elapsed, anybody can call the access controller component to confirm the proposal.

Access controller components can be configured to allow or disallow timed recovery. With timed recovery disabled, recovery proposals submitted by the recovery role may not be confirmed by anybody after some delay; with it enabled, they may. The access controller is configured through the timed_recovery_delay_in_minutes argument seen in the access controller's create function and other methods: setting it to None disables timed recovery and setting it to Some(u32) enables it.

The access controller allows each of the Proposer Roles to have an ongoing recovery that is separate from the other's. As an example, the Primary (P) role can initiate recovery and propose one set of roles, and the Recovery (R) role can initiate recovery and propose a different set of roles without overriding the proposal submitted by the Primary (P) role; both proposals co-exist. In a case like this, the confirmation role could examine which of the Primary (P) and Recovery (R) roles' proposals appears correct and confirm that particular proposal. Alternatively, if the access controller is configured to allow timed confirmation, then anybody can confirm the recovery proposal of the Recovery (R) role after the timed recovery delay has elapsed. In summary, at any point in time the access controller may have a minimum of zero and a maximum of two ongoing recoveries (one for each of the Proposer Roles).
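The proposal and confirmation rules above can be sketched as a small state machine. This is a hypothetical simplification with invented names and types (rule sets reduced to strings, time to minute counters); the real blueprint tracks full AccessRules and enforces the roles through the engine's auth system:

```rust
// Sketch of the recovery rules: each Proposer Role holds one independent
// proposal, quick confirmation needs a different role than the proposer, and
// timed confirmation applies only to the Recovery role's proposal after the
// configured delay. All names here are assumptions for illustration.

#[derive(Clone, Copy)]
enum Proposer { Primary = 0, Recovery = 1 }

#[derive(Clone, Copy)]
enum Role { Primary, Recovery, Confirmation }

struct Proposal { rule_set: &'static str, initiated_at_minute: u64 }

struct RecoverySketch {
    timed_delay_minutes: Option<u64>, // None disables timed recovery
    proposals: [Option<Proposal>; 2], // one slot per Proposer Role
}

impl RecoverySketch {
    fn new(timed_delay_minutes: Option<u64>) -> Self {
        Self { timed_delay_minutes, proposals: [None, None] }
    }

    fn initiate(&mut self, by: Proposer, rule_set: &'static str, now: u64) {
        self.proposals[by as usize] = Some(Proposal { rule_set, initiated_at_minute: now });
    }

    /// Enacting a proposal clears every ongoing proposal and resets state.
    fn enact(&mut self, rules: &'static str) -> &'static str {
        self.proposals = [None, None];
        rules
    }

    /// Quick confirmation: any role other than the proposer may confirm.
    fn quick_confirm(&mut self, p: Proposer, by: Role) -> Result<&'static str, &'static str> {
        let self_confirm = matches!(
            (p, by),
            (Proposer::Primary, Role::Primary) | (Proposer::Recovery, Role::Recovery)
        );
        if self_confirm {
            return Err("a proposal cannot be confirmed by its proposer");
        }
        let rules = self.proposals[p as usize].as_ref().ok_or("no ongoing proposal")?.rule_set;
        Ok(self.enact(rules))
    }

    /// Timed confirmation: only the Recovery role's proposal, after the delay.
    fn timed_confirm(&mut self, now: u64) -> Result<&'static str, &'static str> {
        let delay = self.timed_delay_minutes.ok_or("timed recovery disabled")?;
        let p = self.proposals[Proposer::Recovery as usize]
            .as_ref()
            .ok_or("no ongoing recovery-role proposal")?;
        if now < p.initiated_at_minute + delay {
            return Err("timed recovery delay has not elapsed");
        }
        let rules = p.rule_set;
        Ok(self.enact(rules))
    }
}

fn main() {
    let mut ac = RecoverySketch::new(Some(1440)); // 1-day timed delay
    // Both Proposer Roles can hold independent, co-existing proposals.
    ac.initiate(Proposer::Primary, "rules-from-primary", 0);
    ac.initiate(Proposer::Recovery, "rules-from-recovery", 0);

    // The proposer cannot confirm its own proposal.
    assert!(ac.quick_confirm(Proposer::Primary, Role::Primary).is_err());

    // Timed confirmation fails before the delay elapses, succeeds after.
    assert!(ac.timed_confirm(100).is_err());
    assert_eq!(ac.timed_confirm(1440), Ok("rules-from-recovery"));

    // Enacting one proposal cleared the other ongoing proposal as well.
    assert!(ac.quick_confirm(Proposer::Primary, Role::Confirmation).is_err());
}
```

The two proposal slots mirror the "at most two ongoing recoveries" property described above, and the final assertion mirrors the reset-to-default behavior on confirmation.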
If there is a currently active timed recovery, any of the three roles can choose to stop it. Stopping timed recovery only removes the time element from the recovery proposal, making it unfit to be confirmed after the delay has passed. However, the proposal remains valid: the other two roles can still confirm it and it can be enacted. Upon the confirmation of a recovery proposal, the proposed role definitions and new timed recovery delay are enacted. Additionally, the access controller returns to its default state for everything: the primary role is unlocked, and all ongoing recoveries and withdrawal attempts are cleared.

Withdrawing The Protected Badge

This is the process where some of the existing Roles propose and agree to withdraw the badge protected by the access controller, abandoning the access controller and leaving it in a locked state. This works in a very similar way to the Recovery process, with the exception that there is no timed element to badge withdrawals; only quick confirmations can advance the withdrawal proposal. The following is a more complete description of the badge withdrawal process and its two steps:

- The initiation of badge withdrawal: one of the Proposer Roles submits a badge withdrawal request. Only the Primary (P) and Recovery (R) roles can submit this request.
- The confirmation of badge withdrawal: this requires that 2-of-3 of the roles agree on the badge withdrawal request for the badge to be withdrawn and returned.

Once the badge withdrawal process is confirmed, the badge held by the access controller is returned and the access controller enters lockdown mode, where all of its AccessRules are set to DenyAll and the component becomes completely abandoned. The access controller component allows each of the Proposer Roles to have an ongoing badge withdrawal attempt that is separate from the other's.
Hierarchical State Machine

The following is a Hierarchical State Machine (HSM) that describes the entire logic of access controller components.

Recovery Badges

When an access controller is instantiated, a recovery badge non-fungible resource is created. Its purpose is to be used by the wallet and other clients who wish to utilize it as the recovery badge of the access controller. The choice of whether to use this resource is up to the creator of the access controller; using this badge has the added advantage of it being identifiable in the wallet as a recovery badge. The access controller's recovery badge feature allows account owners to easily use badges in the role configuration that have a system-guaranteed link to the access controller via metadata, as well as a stable resource address for badges created for use with this access controller.

Any of the Roles is allowed to mint recovery badges through the mint_recovery_badges method on the access controller. It is the responsibility of the caller to pick the non-fungible local IDs of the badges that they want; the access controller does not pick them on behalf of the caller. However, the engine will check for collisions, and if there is a collision, the transaction fails. The following is the complete permission list for the recovery badge resource that the access controller creates during initialization.
| Action | Rule | Mutability |
| --- | --- | --- |
| mint | Access Controller's Virtual Component Badge | DenyAll |
| burn | AllowAll | DenyAll |
| withdraw | DenyAll | DenyAll |
| deposit | AllowAll | DenyAll |
| update_metadata | DenyAll | DenyAll |
| update_non_fungible_data | DenyAll | DenyAll |

The following is the metadata that the recovery badge has:

| Key | Value Type | Value |
| --- | --- | --- |
| access_controller | GlobalAddress | Address of the access controller that it belongs to |
| name | String | Recovery Badge |
| icon_url | URL | https://assets.radixdlt.com/icons/icon-recovery_badge.png |

The access controller has a recovery_badge metadata entry which links back to the resource address of the recovery badge. Some notes on the configuration of the recovery badge:

- The access controller recovery badges are freely burnable, so the bearer of a badge may get rid of it if their recovery powers are removed at some point.
- A recovery badge is not withdrawable: once it has been deposited into a vault it may not be withdrawn out of that vault; the only possibility is burning it.
- The metadata of the recovery badge and the access controller that it belongs to point to one another: the access_controller field of the recovery badge metadata contains the component address of the access controller, and the recovery_badge field of the access controller contains the resource address of the recovery badge.

Flows

Creation of an Access Controller

Say we wish to create an access controller component to protect a badge that controls an account. At a high level, say that we would like timed recovery to be possible by the recovery role with a delay of 1 day, and that we would like the Roles to be defined as follows:

| Role | Rule |
| --- | --- |
| Primary (P) Role | a signature from a key pair that my wallet controls |
| Recovery (R) Role | a signature from a key pair that my YubiKey controls |
| Confirmation (C) Role | a fungible badge owned by my friend Bob |
The fact that the Primary (P) and Recovery (R) roles are using a virtual signature badge and not a resource badge is just an example and is not a requirement of the access controller. The access controller can take full advantage of our powerful and versatile access rules system. We can be more specific about the above high-level intent and translate it down to something closer to implementation. Say that we would like our rule set to be something like the following:

| Role | Rule |
| --- | --- |
| Primary Role | A signature from the Ecdsa Secp256k1 key pair controlled by the wallet whose public key is 0279be667ef9dcbbac55a06295ce870b07029bfcdb2dce28d959f2815b16f81798 |
| Recovery Role | A signature from the EdDSA Ed25519 key pair controlled by the YubiKey whose public key is 4cb5abf6ad79fbf5abbccafcc269d85cd2651ed4b885b5869f241aedf0a5ba29 |
| Confirmation Role | A fungible badge owned by my friend Bob whose resource address is resource_sim1t5hpqpl8lvyp669wdth8l66nv6uxpa34rk4pmsynhydk89jp0fw2lv |

The following is a transaction manifest of the flow described above:

```
// Minting the resource that controls the account. This is just an example.
MINT_FUNGIBLE
    Address("${account_badge}")
    Decimal("1");
TAKE_FROM_WORKTOP
    Address("${account_badge}")
    Bucket("bucket1");

// Creating the access controller component
CREATE_ACCESS_CONTROLLER
    Bucket("bucket1") // Badge to protect
    Tuple(
        Enum<2u8>(Enum<0u8>(Enum<0u8>(Enum<0u8>(NonFungibleGlobalId("resource_sim1nfxxxxxxxxxxsecpsgxxxxxxxxx004638826440xxxxxxxxxwj8qq5:[b6e84499b83b0797ef5235553eeb7edaa0cea243c1128c2fe737]"))))), // The primary role
        Enum<2u8>(Enum<0u8>(Enum<0u8>(Enum<0u8>(NonFungibleGlobalId("resource_sim1nfxxxxxxxxxxed25sgxxxxxxxxx002236757237xxxxxxxxx8x44q5:[9f58abcbc2ebd2da349acb10773ffbc37b6af91fa8df2486c9ea]"))))), // The recovery role
        Enum<2u8>(Enum<0u8>(Enum<0u8>(Enum<1u8>(Address("resource_sim1t5hpqpl8lvyp669wdth8l66nv6uxpa34rk4pmsynhydk89jp0fw2lv"))))) // The confirmation role
    ) // Rule set
    Enum<1u8>(1440u32); // Timed recovery delay in minutes
```

Here are some reflections on the manifest provided above:

- Notice that the manifest mints a new ${account_badge} and then uses it as the badge protected by the access controller. This is just an example; these couple of instructions should be replaced by the actual logic that gets the actual badge you wish to be protected by the access controller into the worktop and then into a bucket.
- Notice that 0279be667ef9dcbbac55a06295ce870b07029bfcdb2dce28d959f2815b16f81798, the Ecdsa Secp256k1 public key associated with the key pair controlled by the wallet, does not appear anywhere in the manifest in its raw form. It appears as NonFungibleGlobalId("resource_sim1qgqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqq056vhf:[b6e84499b83b0797ef5235553eeb7edaa0cea243c1128c2fe737]"), which is the non-fungible global ID of the Ecdsa Secp256k1 virtual signature badge associated with the public key.
- Notice that 4cb5abf6ad79fbf5abbccafcc269d85cd2651ed4b885b5869f241aedf0a5ba29, the EdDSA Ed25519 public key associated with the key pair controlled by the YubiKey, does not appear anywhere in the manifest in its raw form.
It appears as NonFungibleGlobalId("resource_sim1qgqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqs64j5z6:[9f58abcbc2ebd2da349acb10773ffbc37b6af91fa8df2486c9ea]"), which is the non-fungible global ID of the EdDSA Ed25519 virtual signature badge associated with the public key.
- The derivation of the non-fungible global ID of the virtual signature badge is a relatively simple process that only requires the ability to hash data using Blake2b and knowledge of the resource addresses associated with the Ecdsa Secp256k1 and EdDSA Ed25519 curves' virtual resources. Nonetheless, the Radix Engine Toolkit provides this derivation.
- In this example, we wanted the access controller to allow for timed recovery with a delay period of one day. The delay period is provided in minutes as the Enum<1u8>(1440u32) argument.

Performing Recovery

If one of the roles of the access controller is compromised, or some of the roles would like to update the role definitions of the access controller, recovery can be performed. The recovery process is made up of two steps:

- Initiation
- Confirmation, which can either be:
  - Quick Confirmation
  - Timed Confirmation

In this section, there are three flows:

- A flow that shows how the recovery process can be initiated by the recovery role.
- A flow that shows how the confirmation role can confirm the recovery proposal made by the recovery role.
- A flow that shows how the recovery role can perform a timed confirmation of their proposal after the timed delay has passed.

Initiating Recovery

The primary and recovery roles can initiate recovery on the access controller, meaning that they can submit a proposal with the rules that they wish the access controller to transition to if the recovery proposal is accepted.
In this flow, the recovery role wishes to propose that the access controller transitions to the following set of rules:
- Primary Role: A signature from the Ecdsa Secp256k1 key pair controlled by a new wallet whose public key is 02f9308a019258c31049344f85f89d5229b531c845836f99b08601f113bce036f9
- Recovery Role: A signature from the EdDSA Ed25519 key pair controlled by a new YubiKey whose public key is f381626e41e7027ea431bfe3009e94bdd25a746beec468948d6c3c7c5dc9a54b
- Confirmation Role: A signature from the EdDSA Ed25519 key pair controlled by another YubiKey whose public key is fd50b8e3b144ea244fbf7737f550bc8dd0c2650bbc1aada833ca17ff8dbf329b
- Timed Recovery Delay: Enabled, with a 7-day (10080-minute) delay

While the recovery role is the role proposing the rule change in this example, it does not always have to be; the primary role can also initiate recovery. The flow for this would be as follows:
- Obtain the badges and/or signatures that make up the recovery role. If we’re continuing from the previous example where we instantiated the access controller, the recovery role requires a signature from an EdDSA Ed25519 key pair whose public key is 4cb5abf6ad79fbf5abbccafcc269d85cd2651ed4b885b5869f241aedf0a5ba29.
- Call the initiate_recovery_as_recovery method to initiate the recovery process and submit a recovery proposal as the recovery role, specifying the proposed new rule set. Had the primary role been the one initiating recovery, it would have called the initiate_recovery_as_primary method instead.
The following is a transaction manifest of the above-described flow:

CALL_METHOD
    Address("${access_controller_component}")
    "initiate_recovery_as_recovery"
    Tuple(
        Enum<2u8>(Enum<0u8>(Enum<0u8>(Enum<0u8>(NonFungibleGlobalId("resource_sim1nfxxxxxxxxxxsecpsgxxxxxxxxx004638826440xxxxxxxxxwj8qq5:[1c99dfb4448f92a28be31b541cfed52f1b61734e4aefc18914f8]"))))), // The primary role
        Enum<2u8>(Enum<0u8>(Enum<0u8>(Enum<0u8>(NonFungibleGlobalId("resource_sim1nfxxxxxxxxxxed25sgxxxxxxxxx002236757237xxxxxxxxx8x44q5:[a5ca01ea8e0e59b1c8abdb520edfb19a24571b5a747498cad627]"))))), // The recovery role
        Enum<2u8>(Enum<0u8>(Enum<0u8>(Enum<0u8>(NonFungibleGlobalId("resource_sim1nfxxxxxxxxxxed25sgxxxxxxxxx002236757237xxxxxxxxx8x44q5:[54fc86e5651ed504d4636e702fa39fbe7fa24d9dbe57212ab073]"))))) // The confirmation role
    ) // The proposed access rules
    Enum<1u8>(10080u32); // The timed recovery delay being proposed

Here are some reflections on the above manifest:
- As in the manifest seen in the “Creation of an Access Controller” section, the public keys used for the access rules do not appear in their raw form. Instead, the non-fungible global ID of the virtual signature badge associated with each public key is what appears in the manifest.
- The timed recovery delay is provided as Enum<1u8>(10080u32). The values there mean the following:
  - The 1u8 means that this is the Some variant of the Option enum, meaning the recovery role proposes that the access controller should have a timed recovery delay.
  - The 10080u32 is the timed recovery delay specified in minutes; 10080 minutes translates to 7 days.

Quick Confirm Recovery

After the recovery role initiated recovery in the flow seen in the “Initiating Recovery” section, we would now like to confirm the recovery proposal through the confirmation role. While this example shows how the confirmation role can confirm a recovery proposal, any role can confirm a proposal that it has not itself submitted.
The confirmation role wishes to confirm the proposal made by the recovery role, and it knows the set of rules that the recovery role proposed. It will need to state that set of rules when performing the confirmation. The flow for this would be as follows:
- Obtain the badges and/or signatures that make up the confirmation role. If we’re continuing from the previous example where we instantiated the access controller, the confirmation role requires a proof of the resource with address resource_sim1t5hpqpl8lvyp669wdth8l66nv6uxpa34rk4pmsynhydk89jp0fw2lv to be present in the auth zone of the caller.
- Call the quick_confirm_recovery_role_recovery_proposal method on the access controller. If the primary role were the role that initiated the recovery process, we would instead call the quick_confirm_primary_role_recovery_proposal method.
  - Calling the quick confirm method requires the caller to have knowledge of the rules that they’re confirming and to specify them again.
  - If the rules specified as arguments are not the proposed rules, the access controller panics.

The following is a transaction manifest of the above-described flow:

CALL_METHOD
    Address("${access_controller_component}")
    "quick_confirm_recovery_role_recovery_proposal"
    Tuple(
        Enum<2u8>(Enum<0u8>(Enum<0u8>(Enum<0u8>(NonFungibleGlobalId("resource_sim1nfxxxxxxxxxxsecpsgxxxxxxxxx004638826440xxxxxxxxxwj8qq5:[1c99dfb4448f92a28be31b541cfed52f1b61734e4aefc18914f8]"))))), // The primary role
        Enum<2u8>(Enum<0u8>(Enum<0u8>(Enum<0u8>(NonFungibleGlobalId("resource_sim1nfxxxxxxxxxxed25sgxxxxxxxxx002236757237xxxxxxxxx8x44q5:[a5ca01ea8e0e59b1c8abdb520edfb19a24571b5a747498cad627]"))))), // The recovery role
        Enum<2u8>(Enum<0u8>(Enum<0u8>(Enum<0u8>(NonFungibleGlobalId("resource_sim1nfxxxxxxxxxxed25sgxxxxxxxxx002236757237xxxxxxxxx8x44q5:[54fc86e5651ed504d4636e702fa39fbe7fa24d9dbe57212ab073]"))))) // The confirmation role
    ) // The proposed access rules
    Enum<1u8>(10080u32); // The timed recovery delay being proposed
Here are some reflections on the above manifest:
- As in the manifest seen in the “Creation of an Access Controller” section, the public keys used for the access rules do not appear in their raw form. Instead, the non-fungible global ID of the virtual signature badge associated with each public key is what appears in the manifest.
- When confirming a proposal, the confirmer needs to respecify the entire proposal, including the rule set and the timed recovery delay.

Timed Confirm Recovery

After the recovery role initiated recovery in the flow seen in the “Initiating Recovery” section, some time has passed and timed confirmation of the recovery role’s recovery proposal is now possible. Anyone can now confirm the recovery role’s proposal. This only requires a call to the timed_confirm_recovery method, restating the recovery proposal. The following is a transaction manifest of the above-described flow:

CALL_METHOD
    Address("${access_controller_component}")
    "timed_confirm_recovery"
    Tuple(
        Enum<2u8>(Enum<0u8>(Enum<0u8>(Enum<0u8>(NonFungibleGlobalId("resource_sim1nfxxxxxxxxxxsecpsgxxxxxxxxx004638826440xxxxxxxxxwj8qq5:[1c99dfb4448f92a28be31b541cfed52f1b61734e4aefc18914f8]"))))), // The primary role
        Enum<2u8>(Enum<0u8>(Enum<0u8>(Enum<0u8>(NonFungibleGlobalId("resource_sim1nfxxxxxxxxxxed25sgxxxxxxxxx002236757237xxxxxxxxx8x44q5:[a5ca01ea8e0e59b1c8abdb520edfb19a24571b5a747498cad627]"))))), // The recovery role
        Enum<2u8>(Enum<0u8>(Enum<0u8>(Enum<0u8>(NonFungibleGlobalId("resource_sim1nfxxxxxxxxxxed25sgxxxxxxxxx002236757237xxxxxxxxx8x44q5:[54fc86e5651ed504d4636e702fa39fbe7fa24d9dbe57212ab073]"))))) // The confirmation role
    ) // The proposed access rules
    Enum<1u8>(10080u32); // The timed recovery delay being proposed

Here are some reflections on the above manifest:
- As in the manifest seen in the “Creation of an Access Controller” section, the public keys used for the access rules do not appear in their raw form.
Instead, the non-fungible global ID of the virtual signature badge associated with each public key is what appears in the manifest.
- When confirming a proposal, the confirmer needs to respecify the entire proposal, including the rule set and the timed recovery delay.

Stopping Timed Recovery

The time element of recovery proposals made by the recovery role can be disabled by any of the three roles. The flow for this would be as follows:
- Obtain the badges and/or signatures that make up the primary role, recovery role, or confirmation role.
- Call the stop_timed_recovery method, restating the recovery proposal.

The following is a transaction manifest of the above-described flow:

CALL_METHOD
    Address("${access_controller_component}")
    "stop_timed_recovery"
    Tuple(
        Enum<2u8>(Enum<0u8>(Enum<0u8>(Enum<0u8>(NonFungibleGlobalId("resource_sim1qgqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqq056vhf:[1c99dfb4448f92a28be31b541cfed52f1b61734e4aefc18914f8]"))))), // The primary role
        Enum<2u8>(Enum<0u8>(Enum<0u8>(Enum<0u8>(NonFungibleGlobalId("resource_sim1qgqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqs64j5z6:[a5ca01ea8e0e59b1c8abdb520edfb19a24571b5a747498cad627]"))))), // The recovery role
        Enum<2u8>(Enum<0u8>(Enum<0u8>(Enum<0u8>(NonFungibleGlobalId("resource_sim1qgqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqs64j5z6:[54fc86e5651ed504d4636e702fa39fbe7fa24d9dbe57212ab073]"))))) // The confirmation role
    ) // The proposed access rules
    Enum<1u8>(10080u32); // The timed recovery delay being proposed

Here are some reflections on the above manifest:
- As in the manifest seen in the “Creation of an Access Controller” section, the public keys used for the access rules do not appear in their raw form. Instead, the non-fungible global ID of the virtual signature badge associated with each public key is what appears in the manifest.
- When stopping the timed recovery of a proposal, we need to respecify the entire proposal, including the rule set and the timed recovery delay.
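The recovery lifecycle walked through in the sections above — initiation, quick confirmation, timed confirmation, and stopping timed recovery — can be summarized as a small state machine. The Python sketch below is illustrative only: the method names mirror the blueprint’s methods for readability, the delay is modeled as a simple minute counter, and the real access controller enforces each transition with badges and signatures rather than plain method calls.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass(frozen=True)
class Proposal:
    """A recovery proposal: a proposed rule set plus an optional timed delay in minutes."""
    rule_set: tuple
    delay_minutes: Optional[int]


class AccessControllerSketch:
    """Hypothetical model of the recovery flow; not the real blueprint."""

    def __init__(self, rule_set: tuple, timed_delay_minutes: Optional[int]):
        self.rule_set = rule_set
        self.timed_delay = timed_delay_minutes
        self.pending: Optional[Proposal] = None
        self.minutes_since_initiation = 0

    def initiate_recovery_as_recovery(self, proposal: Proposal) -> None:
        # Store the proposal in state and start the clock for timed confirmation.
        self.pending = proposal
        self.minutes_since_initiation = 0

    def quick_confirm_recovery_role_recovery_proposal(self, proposal: Proposal) -> None:
        # The confirmer must restate the exact pending proposal or the controller "panics".
        self._check(proposal)
        self._enact(proposal)

    def timed_confirm_recovery(self, proposal: Proposal) -> None:
        # Only valid when timed recovery is enabled and the full delay has elapsed.
        if self.timed_delay is None or self.minutes_since_initiation < self.timed_delay:
            raise RuntimeError("timed recovery delay has not passed")
        self._check(proposal)
        self._enact(proposal)

    def stop_timed_recovery(self, proposal: Proposal) -> None:
        # Removes only the time element; the proposal can still be quick-confirmed.
        self._check(proposal)
        self.timed_delay = None

    def _check(self, proposal: Proposal) -> None:
        if proposal != self.pending:
            raise RuntimeError("stated proposal does not match the pending one")

    def _enact(self, proposal: Proposal) -> None:
        self.rule_set = proposal.rule_set
        self.timed_delay = proposal.delay_minutes
        self.pending = None
```

For example, quick confirmation immediately after initiation swaps in the proposed rule set, while attempting a timed confirmation before the delay has passed fails — matching the flows described above.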
Creation of a Proof from the Access Controller

Once the access controller has been instantiated, the primary role can request that the access controller create a proof of the resource it protects. When the access controller protects the badge that controls an account, this means the primary role can request a proof of the account badge from the access controller and thereby access privileged methods on the account. To perform this, the flow is as follows:
- Obtain the badges and/or signatures that make up the primary role.
- Call the create_proof method on the access controller component.

The following is a transaction manifest of the above-described flow:

CALL_METHOD Address("${access_controller_component}") "create_proof";

Locking the Primary Role

The primary role can be locked by the recovery role, which removes the primary role’s ability to create proofs from the access controller. To perform this, the flow is as follows:
- Obtain the badges and/or signatures that make up the recovery role.
- Call the lock_primary_role method on the access controller component.

The following is a transaction manifest of the above-described flow:

CALL_METHOD Address("${access_controller_component}") "lock_primary_role";

Unlocking the Primary Role

The primary role can be unlocked by the recovery role, which restores the primary role’s ability to create proofs from the access controller. To perform this, the flow is as follows:
- Obtain the badges and/or signatures that make up the recovery role.
- Call the unlock_primary_role method on the access controller component.

The following is a transaction manifest of the above-described flow:

CALL_METHOD Address("${access_controller_component}") "unlock_primary_role";

API Reference

This section is a reference for the interface of the access controller blueprint.

Create

Creates a new access controller global component that protects the provided badge with the given rule set and timed recovery delay.
Name: create
Type: Function
Callable By: Public
Arguments:
- controlled_asset - Bucket: A bucket of the asset that the access controller is to protect and create proofs of when requested by the primary role.
- rule_set - RuleSet: Defines the AccessRules for each of the Primary, Recovery, and Confirmation roles.
- timed_recovery_delay_in_minutes - Option: An optional 32-bit unsigned integer that defines the amount of time in minutes the recovery role needs to wait before doing a timed confirmation of their recovery and enacting the proposed rule set.
Returns: None

CreateProof

Creates a proof from the badge that the access controller protects.

Name: create_proof
Type: Method
Callable By: Primary Role
Arguments: None
Returns:
- Proof - a proof of the badge(s) held by the access controller.

InitiateRecoveryAsPrimary

Initiates the recovery process as the primary role. This method can only be called by the primary role to initiate the recovery process. It takes the proposed rule set and proposed timed recovery delay as arguments and stores them in the access controller’s state. To enact this recovery, either the confirmation or the recovery role needs to confirm it through the quick_confirm_primary_role_recovery_proposal method. Timed recovery cannot be performed here since it’s something that only the recovery role can do.

Name: initiate_recovery_as_primary
Type: Method
Callable By: Primary Role
Arguments:
- rule_set - RuleSet: The set of AccessRules for each of the Primary, Recovery, and Confirmation roles that the primary role proposes.
- timed_recovery_delay_in_minutes - Option: The recovery delay in minutes that the primary role proposes.
Returns: None

InitiateRecoveryAsRecovery

Initiates the recovery process as the recovery role. This method can only be called by the recovery role to initiate the recovery process. It takes the proposed rule set and proposed timed recovery delay as arguments and stores them in the access controller’s state.
Name: initiate_recovery_as_recovery
Type: Method
Callable By: Recovery Role
Arguments:
- rule_set - RuleSet: The set of AccessRules for each of the Primary, Recovery, and Confirmation roles that the recovery role proposes.
- timed_recovery_delay_in_minutes - Option: The recovery delay in minutes that the recovery role proposes.
Returns: None

QuickConfirmPrimaryRoleRecoveryProposal

Confirms the recovery proposal proposed by the primary role, enacting the proposed rules and the proposed delay for timed recovery.

Name: quick_confirm_primary_role_recovery_proposal
Type: Method
Callable By: Recovery Role OR Confirmation Role
Arguments:
- rule_set - RuleSet: The set of rules originally proposed by the primary role. They are provided again to this method so that the caller is aware of exactly what they’re confirming. If the current recovery proposal by the primary role does not match the arguments to this method, the access controller panics and the transaction fails.
- timed_recovery_delay_in_minutes - Option: The delay for timed recovery originally proposed by the primary role. This is provided again to this method so that the caller is aware of exactly what they’re confirming. If the current recovery proposal by the primary role does not match the arguments to this method, the access controller panics and the transaction fails.
Returns: None

QuickConfirmRecoveryRoleRecoveryProposal

Confirms the recovery proposal proposed by the recovery role, enacting the proposed rules and the proposed delay for timed recovery.

Name: quick_confirm_recovery_role_recovery_proposal
Type: Method
Callable By: Primary Role OR Confirmation Role
Arguments:
- rule_set - RuleSet: The set of rules originally proposed by the recovery role. They are provided again to this method so that the caller is aware of exactly what they’re confirming. If the current recovery proposal by the recovery role does not match the arguments to this method, the access controller panics and the transaction fails.
- timed_recovery_delay_in_minutes - Option: The delay for timed recovery originally proposed by the recovery role. This is provided again to this method so that the caller is aware of exactly what they’re confirming. If the current recovery proposal by the recovery role does not match the arguments to this method, the access controller panics and the transaction fails.
Returns: None

CancelPrimaryRoleRecoveryProposal

Cancels the recovery proposal proposed by the primary role.

Name: cancel_primary_role_recovery_proposal
Type: Method
Callable By: Primary Role
Arguments: None
Returns: None

CancelRecoveryRoleRecoveryProposal

Cancels the recovery proposal proposed by the recovery role.

Name: cancel_recovery_role_recovery_proposal
Type: Method
Callable By: Recovery Role
Arguments: None
Returns: None

StopTimedRecovery

Stops the timed recovery proposed by the recovery role. Stopping timed recovery does not completely remove the recovery role’s proposal; it only removes the time element from it. The proposal remains valid and can be confirmed by the primary or confirmation role through the quick_confirm_recovery_role_recovery_proposal method.

Name: stop_timed_recovery
Type: Method
Callable By: Primary Role OR Recovery Role OR Confirmation Role
Arguments:
- rule_set - RuleSet: The set of rules originally proposed by the recovery role. They are provided again to this method so that the caller is aware of exactly what they’re confirming. If the current recovery proposal by the recovery role does not match the arguments to this method, the access controller panics and the transaction fails.
Returns: None

InitiateBadgeWithdrawAttemptAsPrimary

Initiates the badge withdrawal process as the primary role by submitting a badge withdrawal attempt. For the badge to be withdrawn, either the confirmation or the recovery role needs to confirm this badge withdrawal attempt through the quick_confirm_primary_role_badge_withdraw_attempt method.
While the recovery process allows the recovery role to perform timed recovery on access controllers configured to allow it, the same is not true for badge withdrawals: they have no time element and can only proceed when another role confirms them.

Name: initiate_badge_withdraw_attempt_as_primary
Type: Method
Callable By: Primary Role
Arguments: None
Returns: None

InitiateBadgeWithdrawAttemptAsRecovery

Initiates the badge withdrawal process as the recovery role by submitting a badge withdrawal attempt. For the badge to be withdrawn, either the confirmation or the primary role needs to confirm this badge withdrawal attempt through the quick_confirm_recovery_role_badge_withdraw_attempt method. While the recovery process allows the recovery role to perform timed recovery on access controllers configured to allow it, the same is not true for badge withdrawals: they have no time element and can only proceed when another role confirms them.

Name: initiate_badge_withdraw_attempt_as_recovery
Type: Method
Callable By: Recovery Role
Arguments: None
Returns: None

QuickConfirmPrimaryRoleBadgeWithdrawAttempt

Confirms the badge withdrawal attempt made by the primary role and returns the badge that the access controller protects in a Bucket. Once the badge is withdrawn, the access controller goes into complete lockdown, and the rules for all of the methods are set to DenyAll, so the controller can never be used again.

Name: quick_confirm_primary_role_badge_withdraw_attempt
Type: Method
Callable By: Recovery Role OR Confirmation Role
Arguments: None
Returns:
- Bucket - a bucket containing the badge that the access controller protected.

QuickConfirmRecoveryRoleBadgeWithdrawAttempt

Confirms the badge withdrawal attempt made by the recovery role and returns the badge that the access controller protects in a Bucket. Once the badge is withdrawn, the access controller goes into complete lockdown, and the rules for all of the methods are set to DenyAll, so the controller can never be used again.
Name: quick_confirm_recovery_role_badge_withdraw_attempt
Type: Method
Callable By: Primary Role OR Confirmation Role
Arguments: None
Returns:
- Bucket - a bucket containing the badge that the access controller protected.

CancelPrimaryRoleBadgeWithdrawAttempt

Cancels the badge withdrawal attempt made by the primary role.

Name: cancel_primary_role_badge_withdraw_attempt
Type: Method
Callable By: Primary Role
Arguments: None
Returns: None

CancelRecoveryRoleBadgeWithdrawAttempt

Cancels the badge withdrawal attempt made by the recovery role.

Name: cancel_recovery_role_badge_withdraw_attempt
Type: Method
Callable By: Recovery Role
Arguments: None
Returns: None

LockPrimaryRole

Locks the primary role. This removes the primary role’s ability to create proofs from the badge guarded by the access controller.

Name: lock_primary_role
Type: Method
Callable By: Recovery Role
Arguments: None
Returns: None

UnlockPrimaryRole

Unlocks the primary role. This restores the primary role’s ability to create proofs from the badge guarded by the access controller.

Name: unlock_primary_role
Type: Method
Callable By: Recovery Role
Arguments: None
Returns: None

MintRecoveryBadges

Mints recovery badges with the non-fungible local ids specified as an argument.

Name: mint_recovery_badges
Type: Method
Callable By: Primary Role OR Recovery Role
Arguments:
- non_fungible_local_ids - IndexSet: The set of non-fungible local ids of the recovery badges that the caller wishes to mint. The recovery badge resource only accepts integer non-fungible local ids; if a non-integer type is specified, the engine panics and execution fails.
Returns: None

ContributeRecoveryFee

A public method, callable by anyone, that allows users to deposit XRD into the access controller’s fee vault.

Name: contribute_recovery_fee
Type: Method
Callable By: Public
Arguments:
- bucket - Bucket: A bucket of XRD to deposit into the access controller's recovery vault.
Returns: None

LockRecoveryFee

A method callable by either the primary, recovery, or confirmation role to lock some XRD for fees from the access controller’s XRD fee vault.

Name: lock_recovery_fee
Type: Method
Callable By: Primary Role OR Recovery Role OR Confirmation Role
Arguments:
- amount - Decimal: The amount of XRD to lock for fees from the access controller’s XRD fee vault.
Returns: None

WithdrawRecoveryFee

A method callable by the primary role to withdraw XRD from the recovery fees vault.

Name: withdraw_recovery_fee
Type: Method
Callable By: Primary Role
Arguments:
- amount - Decimal: The amount of XRD to withdraw from the access controller’s XRD fee vault.
Returns:
- Bucket - A bucket of the withdrawn XRD.

## Account

URL: https://radix.wiki/developers/legacy-docs/reference/radix-engine/native-blueprints/account
Updated: 2026-02-18

Summary: Unlike most blockchain platforms, an account on Radix is not simply associated with your public and private key. Instead, an account is a component, instantiat

Reference > Radix Engine > Native Blueprints > Account — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/account.md)

Unlike most blockchain platforms, an account on Radix is not simply associated with your public and private key. Instead, an account is a component, instantiated from a built-in account blueprint provided by the system, which exists in the application layer. Even though accounts are implemented in the application layer, account components are unique in that they’re afforded some features that are not afforded to other, normal components. As a result, accounts on Radix can contain resources and have special logic built into them.

Account Authentication

An account has an owner who controls the account, and the owner of the account is able to:
- Lock a fee from the XRD vault that the account may hold to pay for transaction fees.
- Manage the account by withdrawing and depositing resources.
- Create proofs against resources that the account stores.
- Configure which deposits the account accepts and rejects.

Furthermore, account components have a security model which employs a Role Based Access Control (RBAC) model, where pre-configured roles are mapped to privileged methods that only those roles have access to. These roles delineate who is allowed to do what with the account. The account blueprint has two pre-configured roles by default: Owner and Securify.
- The Owner role is given the ability to call all the privileged methods on the account (such as methods that withdraw and deposit resources or lock XRD for fee payment).
- The Securify role is the role that can call the methods that "securify" the account. This allows the account’s authorization model to be expanded and enables things like multi-factor control. More information on account securification is provided in the Account Securification (account.md#account-securification) section.

While the Owner can expand who is allowed to access their account by re-configuring the roles, by default at instantiation both roles are associated with the owner. The diagram below shows a complete list of the account methods and the roles that they map to; in other words, it shows the roles that are authorized to call these methods. It also shows the mapping of the roles to the access rules. All the methods seen in the diagram below are explained in detail in the API Reference (account.md#api-reference) section of the document.

How the Account Component Works

Use of Radix accounts is done through calls to component methods using the transaction manifest (as with any component).
For example, a simple transfer of 10 XRD tokens from Alice’s account to Bob’s is accomplished by creating a transaction manifest (transaction-manifest) that describes these steps: - Lock fees to pay for the transaction - Call the withdraw method on Alice’s account component, requesting 10 XRD - Take the returned tokens from the worktop and put them in a bucket - Pass this bucket to the try_deposit_or_x method of Bob’s account component As long as Alice’s account does in fact have 10 XRD to withdraw (and is authorized to withdraw from that component - more on this below), the 10 XRD are returned and deposited in Bob’s account (as long as Bob’s account is not configured to deny XRD). If any of these assumptions are incorrect, the whole transaction fails. In this way, using a Radix-style account is intuitively like getting cash from your wallet when you want to pay for something. Account Addresses and Pre-allocation There are two main types of account address: - Addresses for pre-allocated accounts, which are implicitly instantiated via on-ledger interactions. These addresses encode either a Secp256k1 or Ed25519 public key hash, and at instantiation time, the initial Owner role is assigned to require a signature of the corresponding public key. - Addresses for explicitly-allocated accounts. The difference between these addresses is what happens if no account has been instantiated at that address yet. A pre-allocated account effectively already exists in a “virtual” state before it is instantiated, and the Core API and Gateway API return data for an un-instantiated pre-allocated account address in this “virtual” state, as if it already existed. The first time it is interacted with on ledger (say, because it receives a deposit), the account shell gets instantiated automatically. The transaction which interacts with that account for the first time pays additional fees to instantiate the account. 
The following is the algorithm used to derive the pre-allocated account address associated with a public key. This derivation is also made available by the Radix Engine Toolkit:
- Take a compressed Ecdsa Secp256k1 public key, or the standard Ed25519 public key.
- Hash the public key with Blake2b, producing a 256-bit digest.
- Construct a 30-byte array by setting the first byte to 0xD1 for a Secp256k1 public key or 0x51 for an Ed25519 public key, and then appending the last 29 bytes of the public key hash.
- Bech32m-encode the above with the account_${network_specifier} HRP, where the network_specifier depends on the network that the address will be used for (see addressing (addressing-on-radix)).

Account Metadata and Owner Keys

There are various standards for account metadata:
- Accounts can be configured as a dApp Definition with certain metadata (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/metadata-standards/metadata-for-verification.md)
- For user accounts, the owner_keys property is used for ROLA (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/rola-radix-off-ledger-auth.md) verification
- In the future, a new standard metadata property will be published for setting encryption keys for message encryption.

Configuring Account Deposit Modes and the Resource Preference Map

There are two types of deposit methods for two different sets of callers:
- Privileged methods such as deposit and deposit_batch, which accept all deposits and can only be called by the account owner.
- Unprivileged methods such as try_deposit_or_abort, try_deposit_or_refund, try_deposit_batch_or_abort, and try_deposit_batch_or_refund, which are reserved for third parties who wish to deposit resources to an account.

Because the privileged deposit* methods can only be called by the account owner, third parties who wish to deposit resources to an account are required to use the try_deposit* methods.
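The four derivation steps described earlier can be sketched in Python. This is an illustrative sketch, not the Radix Engine Toolkit: the bech32m helpers follow the BIP-350 reference algorithm, and the "sim" network specifier (matching the simulator addresses shown elsewhere on this page) is an assumption; production code should use the official toolkit.

```python
import hashlib

CHARSET = "qpzry9x8gf2tvdw0s3jn54khce6mua7l"
BECH32M_CONST = 0x2BC830A3


def bech32_polymod(values):
    # BIP-350 checksum polynomial, used for both creation and verification.
    gen = [0x3B6A57B2, 0x26508E6D, 0x1EA119FA, 0x3D4233DD, 0x2A1462B3]
    chk = 1
    for v in values:
        b = chk >> 25
        chk = (chk & 0x1FFFFFF) << 5 ^ v
        for i in range(5):
            chk ^= gen[i] if ((b >> i) & 1) else 0
    return chk


def hrp_expand(hrp):
    return [ord(c) >> 5 for c in hrp] + [0] + [ord(c) & 31 for c in hrp]


def to_5bit(data):
    # Regroup 8-bit bytes into 5-bit values, padding the final group with zeros.
    acc, bits, out = 0, 0, []
    for byte in data:
        acc = (acc << 8) | byte
        bits += 8
        while bits >= 5:
            bits -= 5
            out.append((acc >> bits) & 31)
    if bits:
        out.append((acc << (5 - bits)) & 31)
    return out


def bech32m_encode(hrp, payload):
    data = to_5bit(payload)
    poly = bech32_polymod(hrp_expand(hrp) + data + [0] * 6) ^ BECH32M_CONST
    checksum = [(poly >> 5 * (5 - i)) & 31 for i in range(6)]
    return hrp + "1" + "".join(CHARSET[d] for d in data + checksum)


def preallocated_account_address(public_key: bytes, curve: str, network: str = "sim") -> str:
    # Steps 1-2: hash the (compressed Secp256k1 or Ed25519) public key with Blake2b-256.
    digest = hashlib.blake2b(public_key, digest_size=32).digest()
    # Step 3: prefix byte 0xD1 (Secp256k1) or 0x51 (Ed25519) + last 29 bytes of the hash.
    prefix = 0xD1 if curve == "secp256k1" else 0x51
    payload = bytes([prefix]) + digest[-29:]
    # Step 4: bech32m-encode with the account_<network_specifier> HRP.
    return bech32m_encode(f"account_{network}", payload)
```

Under these assumptions, `preallocated_account_address(pk, "ed25519")` yields an address beginning with `account_sim1`.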
The try_deposit* methods exist to allow account owners to configure how resources deposited to their account by third parties are treated. There are two settings that the owner of an account component can use to configure how deposited resources are treated: the Resource Preference Map and the Account Deposit Mode.
- The Resource Preference Map is a granular, per-resource configuration that account owners can specify.
- The Account Deposit Mode is a fall-back configuration which determines how resources deposited into an account are broadly treated.

The resource preference map is the first place the account looks when determining whether a resource can be deposited, and it always supersedes the account deposit mode or any other account state.

Resource Preference Map

The account component contains a "resource preference" map in its state. This map stores the preference configuration for each specific resource and can be thought of as the account’s “allow list” and “deny list”. Technically speaking, the resource preference map is defined in code as a KeyValueStore in which each resource address (indicated by its ResourceAddress) is mapped to a ResourcePreference, an enum with Allowed and Disallowed variants. In summary:
- If the ResourcePreference for some resource is Allowed then it is guaranteed to be deposited into the account; if the ResourcePreference for some resource is Disallowed then it is guaranteed to be rejected by the account component. No other state matters.
- A resource cannot be in the “allow list” and the “deny list” at the same time, since a KeyValueStore does not allow duplicate entries with the same key. Thus, a resource can only be Allowed, Disallowed, or have no entry in the resource preference map.

The resource preference map of an account component can be configured by the owner through the set_resource_preference and remove_resource_preference methods.
It’s important to note that no resources are special-cased in the resource preference map. Any resource can be set as Allowed and any resource can be set as Disallowed. This means even XRD can be Disallowed from being deposited into an account through the unprivileged deposit methods.

Account Deposit Mode

When an account doesn’t have a specific resource configured in the resource preference map, the account component falls back to a broader mechanism: the account deposit mode. The account deposit mode is the default treatment of resource deposits that are not specified in the resource preference map. An account deposit mode can be configured in the following ways:
- Accept: If the account doesn’t have a preference for a particular resource, permit the deposit of the resource.
- Reject: If the account doesn’t have a preference for a particular resource, reject the deposit of the resource.
- Allow Existing (or XRD): If the account doesn’t have a preference for a particular resource, permit the deposit of the resource IF the account has ever held the resource before or the resource is XRD.
  - Any resource that the account has a vault for can be deposited into the account while in this mode, even if the account’s vault for that particular resource is empty.
  - If the user wishes to make exceptions to this (e.g. to prevent XRD deposits, or deposits of a previously-held resource), they can configure these explicitly using the resource preference map.

The default deposit mode of an account can be configured by the owner through the set_default_deposit_rule method. In summary:
- If a resource isn’t configured in the resource preference map, then the account’s default deposit mode comes into the picture, which can be Accept, Reject, or AllowExisting.

Authorized Depositors

With these account deposit configurations in mind, another feature to note is that the owner of an account can also specify authorized depositors.
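The decision order described above — the resource preference map first, then the default deposit mode as a fall-back, with authorized depositors (introduced next) able to override a rejection — can be sketched as follows. This is an illustrative model, not engine code: all names are hypothetical, and XRD is represented by a placeholder address.

```python
from enum import Enum
from typing import Dict, Optional, Set


class ResourcePreference(Enum):
    ALLOWED = "Allowed"
    DISALLOWED = "Disallowed"


class DepositMode(Enum):
    ACCEPT = "Accept"
    REJECT = "Reject"
    ALLOW_EXISTING = "AllowExisting"


XRD = "resource_xrd"  # hypothetical placeholder for the XRD resource address


def try_deposit_allowed(
    resource: str,
    preferences: Dict[str, ResourcePreference],
    mode: DepositMode,
    existing_vaults: Set[str],
    depositor_badge: Optional[str] = None,
    authorized_depositors: Set[str] = frozenset(),
) -> bool:
    """Sketch of the account's third-party (try_deposit*) decision logic."""
    # 1. The resource preference map always wins when it has an entry.
    if resource in preferences:
        if preferences[resource] is ResourcePreference.ALLOWED:
            return True
        # Disallowed: only an authorized depositor badge can override.
        return depositor_badge in authorized_depositors
    # 2. Otherwise fall back to the account's default deposit mode.
    if mode is DepositMode.ACCEPT:
        return True
    if mode is DepositMode.ALLOW_EXISTING and (resource == XRD or resource in existing_vaults):
        return True
    # 3. Rejected by the mode: an authorized depositor may still deposit.
    return depositor_badge in authorized_depositors
```

For example, a resource marked Disallowed is rejected even in Accept mode, while an authorized depositor presenting a recognized badge can still deposit it.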
When owners specify authorized depositors for their account, those depositors gain special privileges which allow them to deposit resources into the account regardless of the account's resource preferences and deposit mode settings. This effectively gives authorized depositors the same deposit privileges as the owner. Those privileges start and stop there, however, and the owner can always revoke them. A small caveat is that authorized depositors still need to use the try_deposit* methods even with their deposit privileges, as the protected deposit* methods are reserved for the owner alone. When the owner wants to add an authorized depositor, they may do so by calling the add_authorized_depositor method. When calling this method, the owner specifies the ResourceOrNonFungible of an existing badge that the prospective authorized depositor holds, or creates one for the prospective authorized depositor to receive. The ResourceOrNonFungible input accepts either a ResourceAddress, if the badge used is a fungible resource, or a NonFungibleGlobalId, if the badge used is a non-fungible resource. Once specified, the badge is added to the account's set of authorized depositors, and the authorized depositor must have the badge present when making privileged deposits to the account. For example, if an authorized depositor deposits a resource that has been specified as Disallowed in the resource preference map, or when the account deposit mode is set to Reject, the account does the following:
- Check if the badge passed is in the account's set of authorized depositors.
- Assert the presence of this badge in the auth zone.
- Permit the deposit.
In summary: a resource preference of Allowed, a valid authorized depositor badge, or a permissive default deposit mode each suffices for a deposit to go through; otherwise the deposit is rejected.
Account Securification
Owners of an account component are defined by the Owner role, which is mapped to an access rule.
By default, this access rule requires a signature from the key pairs derived from a seed phrase that control the account. Securifying the account is therefore the process of updating the Owner's associated access rule from this default to a new one which instead specifies a badge (otherwise known as an "owner badge") that controls the account. An account can be securified by calling the securify method on the account. This method returns a bucket that contains the account's owner badge (which the Owner access rule is now updated to require). This badge must then be stored somewhere, ideally in an access controller. The securify method is only callable by the Securify role. When an account is first created, the Securify role is pre-configured to also be the Owner. However, after the account has been securified, the Securify role is re-configured to DenyAll, meaning it can never be changed again and the method can no longer be called. Effectively, securification is the process of switching from signature mode (key pairs) to badge mode, because it changes the owner's access rule from requiring a signature to requiring a badge instead. Switching the account authorization from signature mode to badge mode offers expanded authorization configuration for the owner, as detailed in this article: How Multi-Factor Radix Smart Accounts Work and What They Can Do (https://www.radixdlt.com/blog/how-radix-multi-factor-smart-accounts-work-and-what-they-can-do). The securification process is an important process for the wallet. The wallet will have a dedicated flow for securifying accounts and creating an access controller to store the account's owner badge in. The API Reference section contains an example of a manifest that securifies an account.
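The one-way nature of securification can be sketched as a small state model in Rust. This is illustrative only; the `AccessRule` variants and the badge "address" here are hypothetical simplifications of the real Radix types.

```rust
// Illustrative model of securification: the Owner access rule flips from
// signature mode to badge mode exactly once, after which the Securify
// role becomes DenyAll and the transition can never be repeated.
#[derive(Clone, PartialEq, Debug)]
enum AccessRule {
    RequireSignature(String), // hypothetical: key derived from a seed phrase
    RequireBadge(String),     // hypothetical: owner badge address
    DenyAll,
}

struct Account {
    owner_rule: AccessRule,
    securify_rule: AccessRule, // pre-configured to match the owner rule
}

impl Account {
    fn new(public_key: &str) -> Self {
        let rule = AccessRule::RequireSignature(public_key.to_string());
        Account { owner_rule: rule.clone(), securify_rule: rule }
    }

    /// "Mints" a hypothetical owner badge, points the Owner rule at it, and
    /// locks the Securify role so this can only ever happen once.
    fn securify(&mut self) -> Result<String, &'static str> {
        if self.securify_rule == AccessRule::DenyAll {
            return Err("account is already securified");
        }
        let badge = String::from("account_owner_badge"); // hypothetical address
        self.owner_rule = AccessRule::RequireBadge(badge.clone());
        self.securify_rule = AccessRule::DenyAll;
        Ok(badge)
    }
}

fn main() {
    let mut account = Account::new("seed_phrase_key");
    let badge = account.securify().expect("first call succeeds");
    assert_eq!(account.owner_rule, AccessRule::RequireBadge(badge));
    assert!(account.securify().is_err()); // Securify role is now DenyAll
}
```

In the real system the returned badge is a resource in a bucket and should end up in an access controller; the model only captures the irreversible rule switch.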
Blueprint API - Function Reference
Account Component Rust Docs (https://docs.rs/scrypto/1.2.0/scrypto/component/struct.Account.html)
Create
Name create Type Function Callable by Public Arguments None Returns:
- ComponentAddress: The component address of the account component instantiated by this function.
- Bucket: A bucket containing the account owner badge associated with this account.
Description Creates a new global securified explicitly-allocated account and returns the ComponentAddress of the account and a Bucket of the account's owner badge. This function mints an owner badge and sets it as the owner role, giving it the authority to call privileged methods on the account. This function will never be called by the wallet for any of the wallet flows. It is documented here for the sake of completeness only. The wallet only creates pre-allocated accounts and none of the current flows include the use of explicitly-allocated accounts.
Transaction Manifest
CREATE_ACCOUNT;
TAKE_ALL_FROM_WORKTOP Address("${account_owner_badge_address}") Bucket("owner_badge");
# Do something with the owner badge, ideally create an access controller and deposit
# it there.
Note that the transaction manifest above is not complete; more specifically, the account owner badge returned from the create function is not deposited anywhere in this manifest. Ideally, the account owner badge would be stored in an access controller.
Create Advanced
Name create_advanced Type Function Callable by Public Arguments owner_role - OwnerRole: The role definition of the account owner. Returns ComponentAddress: The component address of the account component instantiated by this function. Description Creates a new global allocated account with the owner rule specified by the caller and returns the ComponentAddress of the account component.
While the create function automatically mints an owner badge and sets that as the owner role, this function allows the caller to specify the AccessRule associated with the owner role, giving the caller more freedom over who can call privileged methods on the account. This is useful for any application where the creator of the account wishes to have some kind of an m-of-n multi-signature account, a 1-of-n account, or an account whose owner is any arbitrarily complex AccessRule. An example of where this might be used is for exchanges, which might want an account controlled by 4-of-6 signatures to ensure that a single compromised key does not result in the loss of funds. Most users of this function do not particularly need the multi-factor authentication and recovery logic of an access controller. This function will never be called by the wallet for any of the wallet flows. It is documented here for the sake of completeness only. The wallet only creates pre-allocated accounts and none of the current flows include the use of global explicitly-allocated accounts.
Transaction Manifest
CREATE_ACCOUNT_ADVANCED
    Enum(
        # Contrived example to show AccessRule configuration
        Enum()
    );
Component API - Method Reference
Lock fee
Name lock_fee Type Method Callable by Owner role Arguments amount - Decimal: The amount of XRD to lock for fees. Returns None Description Locks some amount of XRD in fees from the account's XRD vault.
Transaction Manifest
CALL_METHOD Address("${account_component_address}") "lock_fee" Decimal("${amount_of_xrd_to_lock_for_fees}");
Lock contingent fee
Name lock_contingent_fee Type Method Callable by Owner role Arguments amount - Decimal: The amount of XRD to lock for fees. Returns None Description Locks some amount of XRD in fees from the account's XRD vault, contingent on the success of the transaction. If the transaction succeeds, the locked XRD may be used for fees; if the transaction fails, the locked XRD is not used for fees.
Because of this restriction, this fee doesn't count towards fee payment during the transaction itself, so it can't be used on its own to pay the transaction fee during execution.
Transaction Manifest
CALL_METHOD Address("${account_component_address}") "lock_contingent_fee" Decimal("${amount_of_xrd_to_lock_for_fees}");
Deposit
Name deposit Type Method Callable by Owner role Arguments bucket - Bucket: The bucket of resources to deposit into the account. Returns None Description Deposits a bucket of resources into the account. This method is only callable by the account owner and does not perform any of the checks discussed in the Account Deposit Modes section. It permits all deposits, since it requires the owner authority to be present when calling it. This method is intended to be used for transactions (such as dApp transactions) where the owner is present, and skips deposit mode checks. If building a transfer or deposit where the owner isn't present, use try_deposit_or_abort instead. If the transaction needs to continue if the deposit doesn't succeed (e.g. in an automated airdrop scenario), use try_deposit_or_refund instead.
Transaction Manifest
CALL_METHOD Address("${account_component_address}") "deposit" Bucket("some_bucket");
Deposit batch
Name deposit_batch Type Method Callable by Owner role Arguments buckets - Vec<Bucket>: The buckets of resources to deposit into the account. Returns None Description Deposits multiple buckets of resources into the account. This method is identical to deposit but deposits a vector of buckets instead of a single bucket. This method is intended to be used for transactions (such as dApp transactions) where the owner is present, and skips deposit mode checks. If building a transfer or deposit where the owner isn't present, use try_deposit_batch_or_abort instead. If the transaction needs to continue if the deposit doesn't succeed (e.g. in an automated airdrop scenario), use try_deposit_batch_or_refund instead.
Transaction Manifest
CALL_METHOD Address("${account_component_address}") "deposit_batch" Expression("ENTIRE_WORKTOP");
Try deposit or abort
Name try_deposit_or_abort Type Method Callable by Public Arguments:
- bucket - Bucket: The bucket of resources to attempt to deposit into the account.
- authorized_depositor_badge - Option<ResourceOrNonFungible>: An optional authorized depositor badge to use for this deposit. If specified, it will be checked and used if the deposit can't go through without it.
Returns None Description Attempts to deposit resources into the account, aborting the transaction if the deposit fails because the account has been configured to disallow the deposit. This method is intended to be used for transfers or deposits where the owner isn't present. If the owner is present, use deposit instead. If the transaction needs to continue if the deposit doesn't succeed (e.g. in an automated airdrop scenario), use try_deposit_or_refund instead.
Transaction Manifest
CALL_METHOD Address("${account_component_address}") "try_deposit_or_abort" Bucket("some_bucket") None;
Try deposit or refund
Name try_deposit_or_refund Type Method Callable by Public Arguments:
- bucket - Bucket: The bucket of resources to attempt to deposit into the account.
- authorized_depositor_badge - Option<ResourceOrNonFungible>: An optional authorized depositor badge to use for this deposit. If specified, it will be checked and used if the deposit can't go through without it.
Returns Option<Bucket>: An optional bucket of resources. This is Some if the deposit failed and the bucket is being returned, and None if the deposit succeeded. Description Attempts to deposit resources into the account, refunding them if the deposit fails. This method attempts to deposit a bucket of resources into the account; if the account is configured to disallow deposits of this resource, they're returned and refunded back as a bucket.
This method is intended to be used for automated airdrop scenarios where the owner isn't present. If the owner is present, use deposit instead. For transfers, use try_deposit_or_abort instead. Using this method causes the manifest to be non-conforming (conforming-transaction-manifest-types).
Transaction Manifest
CALL_METHOD Address("${account_component_address}") "try_deposit_or_refund" Bucket("some_bucket") None;
Try deposit batch or abort
Name try_deposit_batch_or_abort Type Method Callable by Public Arguments:
- buckets - Vec<Bucket>: The buckets to attempt to deposit into the account.
- authorized_depositor_badge - Option<ResourceOrNonFungible>: An optional authorized depositor badge to use for this deposit. If specified, it will be checked and used if the deposit can't go through without it.
Returns None Description Attempts to deposit buckets of resources into the account, aborting the transaction if any of them can't be deposited. This method attempts to deposit buckets of resources into the account; if the account is configured to disallow the deposit of any of the resources, the transaction aborts. This method is intended to be used for transfers or deposits where the owner isn't present. If the owner is present, use deposit_batch instead. If the transaction needs to continue if the deposit doesn't succeed (e.g. in an automated airdrop scenario), use try_deposit_batch_or_refund instead.
Transaction Manifest
CALL_METHOD Address("${account_component_address}") "try_deposit_batch_or_abort" Expression("ENTIRE_WORKTOP") None;
Try deposit batch or refund
Name try_deposit_batch_or_refund Type Method Callable by Public Arguments:
- buckets - Vec<Bucket>: The buckets to attempt to deposit into the account.
- authorized_depositor_badge - Option<ResourceOrNonFungible>: An optional authorized depositor badge to use for this deposit. If specified, it will be checked and used if the deposit can't go through without it.
Returns Vec<Bucket>: A vector of buckets, which is empty if all the resources could be deposited, and has the same length as the input if any of the resources could not be deposited. Description Attempts to deposit buckets of resources into the account, refunding all of them if any of them can't be deposited. This method attempts to deposit buckets of resources into the account; if the account is configured to disallow the deposit of any of the resources, they're all returned and refunded back as buckets. This method is intended to be used for automated airdrop scenarios where the owner isn't present. If the owner is present, use deposit_batch instead. For transfers, use try_deposit_batch_or_abort instead. Using this method causes the manifest to be non-conforming (conforming-transaction-manifest-types).
Transaction Manifest
CALL_METHOD Address("${account_component_address}") "try_deposit_batch_or_refund" Expression("ENTIRE_WORKTOP") None;
Withdraw
Name withdraw Type Method Callable by Owner role Arguments:
- resource_address - ResourceAddress: The resource address of the resource to withdraw from the account.
- amount - Decimal: The amount to withdraw from the account.
Returns Bucket: A bucket of the withdrawn resources. Description Withdraws resources from the account by amount. This method withdraws a resource of the given address and amount from the account vaults and returns it in a Bucket.
Transaction Manifest
CALL_METHOD Address("${account_component_address}") "withdraw" Address("${resource_address}") Decimal("${amount}");
Withdraw non-fungibles
Name withdraw_non_fungibles Type Method Callable by Owner role Arguments:
- resource_address - ResourceAddress: The resource address of the resource to withdraw from the account.
- ids - BTreeSet<NonFungibleLocalId>: The set of non-fungible local ids of the resource to withdraw from the account.
Returns Bucket: A bucket of the withdrawn resources. Description Withdraws resources from the account by NonFungibleLocalIds.
This method withdraws a resource of the given address and non-fungible local ids from the account vaults and returns it in a Bucket.
Transaction Manifest
CALL_METHOD Address("${account_component_address}") "withdraw_non_fungibles" Address("${resource_address}") Array<NonFungibleLocalId>(NonFungibleLocalId("${some_non_fungible_local_id}"));
Lock fee and withdraw
Name lock_fee_and_withdraw Type Method Callable by Owner role Arguments:
- amount_to_lock - Decimal: The amount of XRD to lock for fees.
- resource_address - ResourceAddress: The resource address of the resource to withdraw from the account.
- amount - Decimal: The amount to withdraw from the account.
Returns Bucket: A bucket of the withdrawn resources. Description Locks some amount of XRD for fees and withdraws resources from the account by amount. This is a composite method which calls both lock_fee and withdraw in a single call, which makes it slightly cheaper to use when we wish to lock some XRD for fees and also withdraw resources from the account.
Transaction Manifest
CALL_METHOD Address("${account_component_address}") "lock_fee_and_withdraw" Decimal("${amount_of_xrd_to_lock_for_fees}") Address("${resource_address}") Decimal("${amount}");
Lock fee and withdraw non-fungibles
Name lock_fee_and_withdraw_non_fungibles Type Method Callable by Owner role Arguments:
- amount_to_lock - Decimal: The amount of XRD to lock for fees.
- resource_address - ResourceAddress: The resource address of the resource to withdraw from the account.
- ids - BTreeSet<NonFungibleLocalId>: The set of non-fungible local ids of the resource to withdraw from the account.
Returns Bucket: A bucket of the withdrawn resources. Description Locks some amount of XRD for fees and withdraws resources from the account by non-fungible local ids. This is a composite method which calls both lock_fee and withdraw_non_fungibles in a single call, which makes it slightly cheaper to use when we wish to lock some XRD for fees and also withdraw resources from the account.
Transaction Manifest
CALL_METHOD Address("${account_component_address}") "lock_fee_and_withdraw_non_fungibles" Decimal("${amount_of_xrd_to_lock_for_fees}") Address("${resource_address}") Array<NonFungibleLocalId>(NonFungibleLocalId("${some_non_fungible_local_id}"));
Get balance
Name balance Type Method Callable by Public Arguments:
- resource_address - ResourceAddress: The fungible or non-fungible resource address to read the balance of.
Returns Decimal: The total balance of the resource in the account. Description Reads the total balance of the resource in the account. Returns zero if the resource isn't present in the account. First available at the Cuttlefish (cuttlefish) protocol update.
Scrypto
let account: Global<Account> = ...;
let balance = account.balance(resource_address);
Get non-fungible local ids
Name non_fungible_local_ids Type Method Callable by Public Arguments:
- resource_address - ResourceAddress: The non-fungible resource address to read the non-fungible local ids of.
- limit - u32: The maximum number of non-fungible local ids to read.
Returns Vec<NonFungibleLocalId>: A list of the top limit non-fungible local ids (or all that exist). Description Reads the non-fungible local ids in an account, up to some limit. Returns an empty list if the resource isn't present in the account. First available at the Cuttlefish (cuttlefish) protocol update.
Scrypto
let account: Global<Account> = ...;
let local_ids = account.non_fungible_local_ids(resource_address, 10);
Has non-fungible
Name has_non_fungible Type Method Callable by Public Arguments:
- resource_address - ResourceAddress: The non-fungible resource address to check.
- local_id - NonFungibleLocalId: The local id to check.
Returns bool: Whether the non-fungible local id is in the account. Description Checks if the given non-fungible local id is in the account. First available at the Cuttlefish (cuttlefish) protocol update.
Scrypto
let account: Global<Account> = ...;
let is_present_in_account = account.has_non_fungible(resource_address, local_id);
Create proof of amount
Name create_proof_of_amount Type Method Callable by Owner role Arguments:
- resource_address - ResourceAddress: The resource address of the resource to create a proof of.
- amount - Decimal: The amount of the resource to create a proof of.
Returns Proof: A proof of the specified quantity and resource. Description Creates a proof of the specified resource and amount. This method creates a Proof of the resource and amount specified to the method and returns it.
Transaction Manifest
CALL_METHOD Address("${account_component_address}") "create_proof_of_amount" Address("${resource_address}") Decimal("${amount}");
Create proof of non-fungibles
Name create_proof_of_non_fungibles Type Method Callable by Owner role Arguments:
- resource_address - ResourceAddress: The resource address of the resource to create a proof of.
- ids - Array: The set of non-fungible local ids of the resource to create a proof of.
Returns Proof: A proof of the specified non-fungible local ids and resource. Description Creates a proof of the specified resource and non-fungible local ids. This method creates a Proof of the resource and non-fungible local ids specified to the method and returns it.
Transaction Manifest
CALL_METHOD Address("${account_component_address}") "create_proof_of_non_fungibles" Address("${resource_address}") Array<NonFungibleLocalId>(NonFungibleLocalId("${some_non_fungible_local_id}"));
Burn
Name burn Type Method Callable by Owner role Arguments:
- resource_address - ResourceAddress: The resource address of the resource to burn from the account.
- amount - Decimal: The amount of the resource to burn.
Returns None Description Burns the amount of the resource directly from the account's vault.
Transaction Manifest
CALL_METHOD Address("${account_component_address}") "burn" Address("${resource_address}") Decimal("${amount}");
Burn non-fungibles
Name burn_non_fungibles Type Method Callable by Owner role Arguments:
- resource_address - ResourceAddress: The resource address of the resource to burn.
- ids - Array: The set of non-fungible local ids of the resource to burn.
Returns None Description Burns the non-fungibles of the resource directly from the account's vault.
Transaction Manifest
CALL_METHOD Address("${account_component_address}") "burn_non_fungibles" Address("${resource_address}") Array<NonFungibleLocalId>(NonFungibleLocalId("${some_non_fungible_local_id}"));
Set default deposit rule
Name set_default_deposit_rule Type Method Callable by Owner role Arguments default - DefaultDepositRule: Describes how the account should deal with resources that it does not have specific rules for. This can be Accept, Reject, or AllowExisting. Returns None Description Sets the default deposit rule of the account. This method changes the default deposit rule of the account, changing how the account handles third-party deposits of resources that it does not have a specific rule for.
Transaction Manifest
CALL_METHOD Address("${account_component_address}") "set_default_deposit_rule" Enum();
A more complete manifest example of this can be found here (https://github.com/radixdlt/radixdlt-scrypto/blob/develop/radix-transactions/examples/account/deposit_modes.rtm).
Set resource preference
Name set_resource_preference Type Method Callable by Owner role Arguments:
- resource_address - ResourceAddress: The address of the resource to add a resource preference for.
- resource_preference - ResourcePreference: Describes how the account should deal with deposits of this resource. This is either Allowed or Disallowed.
Returns None Description Sets the resource preference of a resource. This method sets and overrides the preference of the resource in the account.
Transaction Manifest
CALL_METHOD Address("${account_component_address}") "set_resource_preference" Address("${resource_address}") Enum();
A more complete manifest example of this can be found here (https://github.com/radixdlt/radixdlt-scrypto/blob/develop/radix-transactions/examples/account/deposit_modes.rtm).
Remove resource preference
Name remove_resource_preference Type Method Callable by Owner role Arguments resource_address - ResourceAddress: The address of the resource to remove the resource preference for. Returns None Description Removes the preference of a resource, making it use the default deposit rule instead. If no preference for this resource exists, nothing happens.
Transaction Manifest
CALL_METHOD Address("${account_component_address}") "remove_resource_preference" Address("${resource_address}");
Add authorized depositor
Name add_authorized_depositor Type Method Callable by Owner role Arguments badge - ResourceOrNonFungible: The badge of the authorized depositor to add, specified as a ResourceOrNonFungible. Returns None Description Adds an authorized depositor badge to the set of authorized depositors.
Transaction Manifest
CALL_METHOD Address("${account_component_address}") "add_authorized_depositor" Enum<1u8>(Address("${resource_address}"));
A more complete manifest example of this can be found here (https://github.com/radixdlt/radixdlt-scrypto/blob/main/radix-transaction-scenarios/generated-examples/bottlenose/account_authorized_depositors/manifests/001--account-authorized-depositors-configure-accounts.rtm).
Remove authorized depositor
Name remove_authorized_depositor Type Method Callable by Owner role Arguments badge - ResourceOrNonFungible: The badge of the authorized depositor to remove, specified as a ResourceOrNonFungible. Returns None Description Removes an authorized depositor badge from the set of authorized depositors.
Transaction Manifest
CALL_METHOD Address("${account_component_address}") "remove_authorized_depositor" Enum<1u8>(Address("${resource_address}"));
A more complete manifest example of this can be found here (https://github.com/radixdlt/radixdlt-scrypto/blob/main/radix-transaction-scenarios/generated-examples/bottlenose/account_authorized_depositors/manifests/001--account-authorized-depositors-configure-accounts.rtm).
Securify
Name securify Type Method Callable by Securify role Arguments None Returns Bucket: A bucket containing the account owner badge associated with this account. Description Securifies the account, transitioning it from operating in signature mode to operating in badge mode. This method securifies the account by minting a new account owner badge, changing the account's current owner access rule to a new access rule requiring the account owner badge, and returning the minted owner badge. The returned badge must then be stored somewhere, ideally in an access controller. This method is only callable by the Securify role. When an account is first instantiated, the Securify role requires the Owner authority. However, after the account has been securified, the Securify role changes to DenyAll, such that securification can never happen again.
Transaction Manifest
CALL_METHOD Address("${account_component_address}") "securify";
TAKE_ALL_FROM_WORKTOP Address("${account_owner_badge_address}") Bucket("owner_badge");
# Do something with the owner badge, ideally create an access controller and deposit
# it there.
Note that the transaction manifest above is not complete; more specifically, the account owner badge returned from the securify method is not deposited anywhere in this manifest. Ideally, the account owner badge would be stored in an access controller.
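As a recap of the deposit-related methods above, the acceptance logic applied by the try_deposit* methods can be sketched in Rust. This is an illustrative model only; the struct, its field names, and the `"xrd"` marker are hypothetical stand-ins for the real on-ledger state.

```rust
use std::collections::{HashMap, HashSet};

type ResourceAddress = &'static str;
type Badge = &'static str;

#[derive(Clone, Copy, PartialEq)]
enum ResourcePreference { Allowed, Disallowed }

#[derive(Clone, Copy)]
enum DefaultDepositRule { Accept, Reject, AllowExisting }

// Hypothetical snapshot of the account state that deposit checks consult.
struct AccountState {
    preferences: HashMap<ResourceAddress, ResourcePreference>,
    default_rule: DefaultDepositRule,
    authorized_depositors: HashSet<Badge>,
    vaults: HashSet<ResourceAddress>, // resources the account has a vault for
}

impl AccountState {
    // Whether a try_deposit* call goes through (it aborts/refunds otherwise).
    fn can_deposit(&self, resource: ResourceAddress, badge: Option<Badge>) -> bool {
        // A presented authorized-depositor badge overrides both the
        // preference map and the default deposit mode.
        if badge.map_or(false, |b| self.authorized_depositors.contains(b)) {
            return true;
        }
        match self.preferences.get(resource) {
            Some(ResourcePreference::Allowed) => true,
            Some(ResourcePreference::Disallowed) => false,
            None => match self.default_rule {
                DefaultDepositRule::Accept => true,
                DefaultDepositRule::Reject => false,
                // AllowExisting also always accepts XRD ("xrd" is a stand-in).
                DefaultDepositRule::AllowExisting => {
                    resource == "xrd" || self.vaults.contains(resource)
                }
            },
        }
    }
}

fn main() {
    let mut account = AccountState {
        preferences: HashMap::new(),
        default_rule: DefaultDepositRule::AllowExisting,
        authorized_depositors: HashSet::new(),
        vaults: HashSet::new(),
    };
    account.preferences.insert("spam_token", ResourcePreference::Disallowed);
    account.authorized_depositors.insert("depositor_badge");
    account.vaults.insert("gold_token");

    assert!(account.can_deposit("xrd", None));         // AllowExisting: XRD ok
    assert!(account.can_deposit("gold_token", None));  // existing vault
    assert!(!account.can_deposit("new_token", None));  // never held before
    assert!(!account.can_deposit("spam_token", None)); // explicitly Disallowed
    assert!(account.can_deposit("spam_token", Some("depositor_badge")));
}
```

The difference between try_deposit_or_abort and try_deposit_or_refund is only in what happens when this check fails: the former aborts the transaction, the latter hands the bucket(s) back.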
## Fungible Resource Manager URL: https://radix.wiki/developers/legacy-docs/reference/radix-engine/native-blueprints/fungible-resource-manager Updated: 2026-02-18 Summary: This document offers a description of the design and implementation of the Fungible Resource Manager blueprint, along with an API reference for all its methods, functions and events. Reference > Radix Engine > Native Blueprints > Fungible Resource Manager — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/fungible-resource-manager.md)
This document offers a description of the design and implementation of the Fungible Resource Manager blueprint. Additionally, this document provides an API reference for all its methods, functions and events.
Background
In Radix, a Resource is a native concept used to implement use-cases typically associated with "tokens" or "assets". You can learn more about its high-level design at Resources (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/resources/README.md). A "Fungible Resource" is a kind of Resource operating on arbitrary quantities, which can be split and combined freely (i.e. its units have neither distinct identity nor individual metadata). The detailed behaviors (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/resources/resource-behaviors.md) of all units of a specific Fungible Resource are defined by its Fungible Resource Manager. This includes the rules for:
- the Resource's maximum divisibility,
- minting and burning the Resource's units,
- freezing and recalling the Resource from Vaults,
- tracking the Resource's total supply.
Many functionalities of a Fungible Resource are implemented by Fungible Vault, Fungible Bucket and Fungible Proof, which are separate Native Blueprints (covered in detail by their respective documentation pages). The sections below focus on the FungibleResourceManager Blueprint itself.
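To illustrate what "split and combined freely, with no distinct identity" means in practice, here is a toy Rust model of fungible behavior. It is not the actual Fungible Bucket implementation; representing amounts as an integer count of smallest units is an assumption made for the sketch.

```rust
// Toy model: a fungible bucket is nothing but an amount. Splitting and
// combining are pure arithmetic because individual units carry no identity.
#[derive(Debug, PartialEq)]
struct FungibleBucket {
    amount: u128, // hypothetical: amount in smallest representable units
}

impl FungibleBucket {
    // Split off part of this bucket into a new one.
    fn take(&mut self, amount: u128) -> FungibleBucket {
        assert!(amount <= self.amount, "insufficient amount in bucket");
        self.amount -= amount;
        FungibleBucket { amount }
    }

    // Combine another bucket of the same resource back into this one.
    fn put(&mut self, other: FungibleBucket) {
        self.amount += other.amount;
    }
}

fn main() {
    let mut bucket = FungibleBucket { amount: 1_000 };
    let part = bucket.take(250);
    assert_eq!((bucket.amount, part.amount), (750, 250));
    bucket.put(part);
    // Nothing distinguishes which "units" came back: only the total matters.
    assert_eq!(bucket.amount, 1_000);
}
```

Contrast this with non-fungible resources, where each unit has its own local id and metadata and cannot be merged away.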
Optional Features Not every Fungible Resource needs to satisfy the same set of use-cases. For this reason, the set of blueprint Features which can be optionally present on a FungibleResourceManager instance is quite broad: - track_total_supply - whether the total supply of the Resource should be tracked, - vault_freeze - whether a Vault holding the Resource can ever be frozen, - vault_recall - whether the Resource can ever be recalled from a Vault, - mint - whether more of the Resource (above initial supply) can ever be minted, - burn - whether the Resource can ever be burned. On-ledger State The FungibleResourceManager blueprint defines two fields: - divisibility, holding an integer in the inclusive range [0, 18], denoting a number of decimal places that the Resource can be split into. In other words: a precision of fixed-point fractional operations performed on the Resource’s amounts. - total_supply (only present if the track_total_supply Feature is enabled), holds an automatically-maintained amount of all units in circulation at any moment (i.e. minted so far and not burned yet). API Reference Functions create Defines a new fungible Resource (i.e. creates its Manager). Name create Type Function Callable By Public Arguments owner_role - OwnerRole (https://github.com/radixdlt/radixdlt-scrypto/blob/01c421e4a5583f3c191f685fc322c1524600a911/radix-engine-interface/src/blueprints/resource/role_assignment.rs#L215) : The owner’s access rule (possibly None). track_total_supply - bool: Whether to enable the supply-tracking feature (validator-1#optional-features) . divisibility - u8: The Resource unit’s divisibility (see its definition (validator-1#onledger-state) ). 
resource_roles - FungibleResourceRoles (https://github.com/radixdlt/radixdlt-scrypto/blob/01c421e4a5583f3c191f685fc322c1524600a911/radix-engine-interface/src/blueprints/resource/fungible/fungible_resource_manager.rs#L16) : The set of rules for all roles, including the optional ones (which cause their respective features (validator-1#optional-features) to be enabled). metadata - ModuleConfig: Configuration of metadata roles and the initial metadata values. address_reservation - Option: An optional reservation of the global address. Returns ResourceAddress: A de-facto identifier of the newly-created Resource; its Manager’s address. create_with_initial_supply Defines a new fungible Resource (i.e. creates its Manager) in the same way as create (validator-1#create) does, and simultaneously mints the requested amount. This variant is useful e.g. for creating a fixed supply of a non-mintable Resource. Name create_with_initial_supply Type Function Callable By Public Arguments owner_role - OwnerRole (https://github.com/radixdlt/radixdlt-scrypto/blob/01c421e4a5583f3c191f685fc322c1524600a911/radix-engine-interface/src/blueprints/resource/role_assignment.rs#L215) : The owner’s access rule (possibly None). track_total_supply - bool: Whether to enable the supply-tracking feature (validator-1#optional-features) . divisibility - u8: The Resource unit’s divisibility (see its definition (validator-1#onledger-state) ). initial_supply - Decimal: The amount to initially mint. resource_roles - FungibleResourceRoles (https://github.com/radixdlt/radixdlt-scrypto/blob/01c421e4a5583f3c191f685fc322c1524600a911/radix-engine-interface/src/blueprints/resource/fungible/fungible_resource_manager.rs#L16) : The set of rules for all roles, including the optional ones (which cause their respective features (validator-1#optional-features) to be enabled). metadata - ModuleConfig: Configuration of metadata roles and the initial metadata values. 
address_reservation - Option: An optional reservation of the global address. Returns A tuple containing: ResourceAddress: A de-facto identifier of the newly-created Resource; its Manager’s address. FungibleBucket: A bucket with the entire initially-minted supply of the Resource. Methods mint Creates the requested amount of the Resource. Note: the mint feature (validator-1#optional-features) must be enabled on the Resource Manager. Name mint Type Method Callable By Public - requires the minter role. Arguments amount - Decimal: An amount to mint. Returns FungibleBucket: A bucket with the newly-minted amount. burn Destroys the given bucket of the Resource. Note: the burn feature (validator-1#optional-features) must be enabled on the Resource Manager. Name burn Type Method Callable By Public - requires the burner role. Arguments bucket - FungibleBucket: A bucket with the amount to burn. Returns Nothing package_burn Destroys the given bucket of the Resource, in the same way as burn does. This is an internal method needed to allow burning Resources from a Vault. Note: the burn feature (validator-1#optional-features) must be enabled on the Resource Manager. Name package_burn Type Method Callable By Own package only. Arguments bucket - FungibleBucket: A bucket with the amount to burn. Returns Nothing create_empty_vault Creates a new empty Vault tailored for storing the managed Resource and supporting the features configured on this Resource Manager. Name create_empty_vault Type Method Callable By Public Arguments None Returns Own: The address of the created Vault. create_empty_bucket Creates a new empty Fungible Bucket tailored for holding the managed Resource. Name create_empty_bucket Type Method Callable By Public Arguments None Returns FungibleBucket: The created Bucket. get_resource_type Queries the managed Resource’s type. 
Name get_resource_type Type Method Callable By Public Arguments None Returns ResourceType: For a fungible Resource Manager this will always be ResourceType::Fungible, containing the Resource’s configured (validator-1#onledger-state) divisibility. get_total_supply Queries the total amount of all units of this Resource currently in circulation. Note: the track_total_supply feature (validator-1#optional-features) must be enabled on the Resource Manager. Name get_total_supply Type Method Callable By Public Arguments None Returns Decimal: The total supply amount. amount_for_withdrawal Converts the input amount to an actual withdrawal amount, taking the Resource’s configured (validator-1#onledger-state) divisibility and the given rounding mode into account. Name amount_for_withdrawal Type Method Callable By Public Arguments request_amount - Decimal: The approximate, requested amount. withdraw_strategy - WithdrawStrategy: - either Exact, which returns the request_amount unchanged, - or one of the RoundingMode (https://github.com/radixdlt/radixdlt-scrypto/blob/01c421e4a5583f3c191f685fc322c1524600a911/radix-engine-common/src/math/rounding_mode.rs#L16) s to be applied. Returns Decimal: The actual, withdrawable amount. drop_empty_bucket Drops the given empty Fungible Bucket. Note: passing a non-empty Bucket will result in an error. Name drop_empty_bucket Type Method Callable By Public Arguments bucket - FungibleBucket: The bucket to drop. Returns Nothing Events A FungibleResourceManager component instance can be a source of the following events: VaultCreationEvent { // The ID of the created Vault. vault_id: NodeId } Emitted when a new Vault for the managed Resource is created (i.e. on create_empty_vault API call). MintFungibleResourceEvent { // The minted amount. amount: Decimal, } Emitted when some amount of the managed Resource is minted (i.e. on the explicit mint API call, but also on create_with_initial_supply). BurnFungibleResourceEvent { // The burned amount. 
amount: Decimal } Emitted when some amount of the managed Resource is burned (i.e. on the explicit burn/package_burn API calls, but also internally when burning the transaction fees). ## Native Blueprints URL: https://radix.wiki/developers/legacy-docs/reference/radix-engine/native-blueprints/native-blueprints Updated: 2026-02-18 Summary: Certain blueprints are so critical, or so universally useful, that they are implemented directly in Rust in the Radix Engine. Like any blueprint, these can be i Reference > Radix Engine > Native Blueprints — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/README.md) Certain blueprints are so critical, or so universally useful, that they are implemented directly in Rust in the Radix Engine. Like any blueprint, these can be instantiated into components. They contain no royalties, and are free for anyone to use. Instantiating a native component is usually done through a dedicated manifest instruction (specifications) , though this is not always the case. Because native components do not require any WASM interpretation, they are highly performant and have minimal fee impact. Adding new native blueprints (or adding new features to existing native blueprints) can only occur as part of a coordinated protocol update. ## Authorization Model URL: https://radix.wiki/developers/legacy-docs/reference/radix-engine/authorization-model Updated: 2026-02-18 Summary: The implementation of the authorization model is discussed in the developer-focused documentation here. Reference > Radix Engine > Authorization Model — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/authorization-model.md) Further Information The implementation of the authorization model is discussed in the developer-focused documentation here (https://radix-engine-docs.radixdlt.com//native/auth/system_module.html) .
## Engine Tech Docs URL: https://radix.wiki/developers/legacy-docs/reference/radix-engine/engine-tech-docs Updated: 2026-02-18 Summary: Developer-focused documentation on the design and implementation of the Babylon Radix Engine is available at Reference > Radix Engine > Engine Tech Docs — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/engine-tech-docs.md) Developer-focused documentation on the design and implementation of the Babylon Radix Engine is available at https://radix-engine-docs.radixdlt.com/ (https://radix-engine-docs.radixdlt.com/) . ## Radix Engine URL: https://radix.wiki/developers/legacy-docs/reference/radix-engine/radix-engine Updated: 2026-02-18 Summary: Legacy documentation: Radix Engine Reference > Radix Engine — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/README.md) ## DApps, Dashboards and Wallets URL: https://radix.wiki/developers/legacy-docs/reference/babylon-technical-concepts/concepts-dapps-dashboards-and-wallets Updated: 2026-02-18 Summary: Not so relevant to exchange integrations - but for completeness, it’s useful to briefly explain how users will interact with the Radix ecosystem. Reference > Integrator Concepts > DApps, Dashboards and Wallets — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/concepts-dapps-dashboards-and-wallets.md) Not so relevant to exchange integrations - but for completeness, it’s useful to briefly explain how users will interact with the Radix ecosystem. - Users will use a mobile wallet. The mobile wallet is designed to feel intuitive for both a mainstream user and a power user, and includes features such as smart accounts (multi-factor + recovery), as well as clear understanding of what’s in a transaction, and transaction preview - made possible by the power of the transaction manifest and design of the Radix network. 
- Users will connect to dApps by running an extension in their browser, and use the Radix Connect button in a dApp in order to connect with their wallet. A demonstration of wallet features is set out in the RadFi keynote (https://www.radixdlt.com/radfi) . ## Network Upgrades once live URL: https://radix.wiki/developers/legacy-docs/reference/babylon-technical-concepts/concepts-network-upgrades-once-live Updated: 2026-02-18 Summary: There will be occasional network upgrades, called “Protocol Updates” (sometimes called “forks” on other chains) which will require upgrading your Babylon node s Reference > Integrator Concepts > Network Upgrades once live — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/concepts-network-upgrades-once-live.md) There will be occasional network upgrades, called “Protocol Updates” (sometimes called “forks” on other chains) which will require upgrading your Babylon node software. If you stick to using the LTS supported endpoints for your integrations, these upgrades are not expected to result in significant changes to your integration - these will not be large upgrades like the Olympia to Babylon migration. The move to a sharded ledger at Xi’an may require minor changes to your integration, but this is some distance in the future.
## Key developer links and version URL: https://radix.wiki/developers/legacy-docs/reference/babylon-technical-concepts/concepts-key-developer-links-and-version Updated: 2026-02-18 Summary: Legacy documentation: Key developer links and version Reference > Integrator Concepts > Key developer links and version — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/concepts-key-developer-links-and-version.md) Details are below: - Node release: https://github.com/radixdlt/babylon-node/releases/ (https://github.com/radixdlt/babylon-node/releases/) - Docs: https://docs.radixdlt.com/ (https://docs.radixdlt.com/) - Typescript Radix Engine Toolkit version: https://www.npmjs.com/package/@radixdlt/radix-engine-toolkit?activeTab=versions (https://www.npmjs.com/package/@radixdlt/radix-engine-toolkit?activeTab=versions) - Core API SDK version: https://www.npmjs.com/package/@radixdlt/babylon-core-api-sdk?activeTab=versions (https://www.npmjs.com/package/@radixdlt/babylon-core-api-sdk?activeTab=versions) - Core API documentation: https://radix-babylon-core-api.redoc.ly/ (https://radix-babylon-core-api.redoc.ly/) ## Getting test XRD and paying fees from the Faucet URL: https://radix.wiki/developers/legacy-docs/reference/babylon-technical-concepts/concepts-getting-test-xrd-and-paying-fees-from-the-faucet Updated: 2026-02-18 Summary: The faucet is available on test networks to get XRD. There are a number of ways to use the faucet: Reference > Integrator Concepts > Getting test XRD and paying fees from the Faucet — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/concepts-getting-test-xrd-and-paying-fees-from-the-faucet.md) The faucet is available on test networks to get XRD. There are a number of ways to use the faucet: - In the mobile wallet, after you’ve created an account, you can click the … menu and request test XRD. 
- The LTS Toolkit has a function for building a transaction to deposit XRD from a faucet into an account. - In a transaction manifest, you can use the following to lock fee or withdraw money from the faucet - the faucet address can be found in the Test Networks section:
# Pay fees from the faucet:
CALL_METHOD Address("") "lock_fee" Decimal("10");
# Get 1000 XRD onto your worktop, put it in a bucket, and deposit it into your account:
CALL_METHOD Address("") "free";
CALL_METHOD Address("") "deposit_batch" Expression("ENTIRE_WORKTOP");
## Transactions for Integrators URL: https://radix.wiki/developers/legacy-docs/reference/babylon-technical-concepts/concepts-transactions Updated: 2026-02-18 Summary: This article explains how to interact with transactions as integrators. If you're looking for a more complete overview of the structure, design and motivation o Reference > Integrator Concepts > Transactions for Integrators — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/concepts-transactions.md) This article explains how to interact with transactions as integrators. If you're looking for a more complete overview of the structure, design and motivation of the Radix transaction model, see the transaction overview (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/transaction-overview.md) . Construction process For most users, the transaction construction process will be handled in their mobile Radix Wallet. Integrators wishing to construct transactions programmatically will need to integrate the Radix Engine Toolkit (RET) into their application for construction and finalization. The RET has a Rust-native core, but a wrapper is provided in a number of different languages to make integration easier. Transaction versions There are two versions of transactions supported on Radix: - Transaction V1 launched at Babylon in September 2023. - Transaction V2 launched at Cuttlefish in December 2024.
Transaction V2 added support for subintents, tips in basis points, and timestamp-based expiry. As a programmatic integrator, you can use either. Much tooling (including the Typescript LTS toolkit) has better support for Transaction V1, and the rest of the guide assumes you're using Transaction V1. If your tooling supports building a Transaction V2, then it is easy to use instead of V1, and new functionality is opt-in. Transaction contents, hashes and identifiers User transactions are formed of a core transaction "intent", which is then signed by 0+ signatories, before being notarized. The output is called a notarized transaction. It is this notarized transaction payload which is submitted to the network. More specifically, you can think of a transaction as a shell. The innermost layer is the Transaction Intent - which is the body of the transaction. It includes: - A “header” with data such as the epoch window in which the transaction is valid, a message, and a nonce to allow for creation of duplicate intents. It also includes the notary public key which needs to sign the transaction before submission, and a flag which marks that the notary should count as a signer. - A “ manifest (transaction-manifest) ” which contains human-readable instructions for the transaction. - The LTS Toolkit has an easy builder to help you create a manifest for fungible transfers without learning about the manifest. - Optionally, “blobs” - payloads which can be referenced from the manifest. The RET can take this intent and create a summary of it: the “(transaction) intent hash”. The Radix Engine guarantees that each intent can be committed no more than once. The transaction identifier or “txid” is the Bech32m-encoded intent hash (sometimes called intent_hash_bech32m in the API).
This is the identifier which should be shown to users, and which can be linked to the dashboard. For example, a mainnet transaction id with explorer link is: txid_rdx1763t9r3pq962lje83dkdhv4wkjpe92z5sqee56rp0l5k26lzrzjs0q5ugq (https://dashboard.radixdlt.com/transaction/txid_rdx1763t9r3pq962lje83dkdhv4wkjpe92z5sqee56rp0l5k26lzrzjs0q5ugq) . The next layer is the Signed Transaction. The intent hash is signed by 0 or more signatories, and these signatures are combined with the intent to form a signed transaction, which is summarized in a “signed transaction hash”. The final layer is the Notarized Transaction. The signed transaction hash is signed by the notary, and combined into the notarized transaction, which is compiled and then submitted to the ledger. A notarized transaction can be summarized by a “notarized transaction hash”, also referred to as “(notarized) payload hash”. Uniqueness of transactions Whilst the engine guarantees that an intent hash is only committed once, it is technically possible for a notary to sign multiple different, valid payloads with the same intent hash. Whilst this would be abnormal behavior, it still needs to be handled by the node. Therefore endpoints which return details about an uncommitted intent (such as transaction status) will also return details for each payload that the node is aware of which contains this intent. Typically this will return only the one payload you’ve submitted. Transaction notary The notary is responsible for ensuring that the transaction has been signed by the right signers, before submitting it to the network. In a later release, the notary will also be able to revoke pending submitted transactions which have not yet been committed (with a new, “cancel” transaction). For typical single-signer transactions, the transaction header can be configured to include the notary as a signer, and the notary can simply be the main signatory, with 0 additional signatures required.
This is cheaper than having a separate notary. For more complex multi-signer transactions, the intended submitter would be the notary. Either this would be one of the signers, or it could be some third-party orchestrator. The notary only has the power to select the signatures and to submit or cancel the transaction. The engine has been designed to ensure that if an intent can be successfully submitted, its outcome is independent of which signatures were present. This means that even if a notary changes the signatures, it can only cause the transaction to fail (which rolls back the result - minus fees), but can’t affect its result when it succeeds. Therefore it is safe to nominate third parties (e.g. transaction orchestrators) as notaries, so long as you are happy with giving them the power to cause a transaction to fail or be cancelled if they include the wrong signatures or choose to cancel it. Transaction outcome Once submitted to a node, a transaction payload can end up being either rejected or committed. The transaction status endpoint can be used to query the current status of a submitted transaction intent. Transactions get rejected if they fail to pass certain criteria at the given time. A transaction payload can be marked as a: - Permanent Rejection if it is never possible for it to be committed (e.g. it's statically invalid, or only valid up until epoch 100 and it's now epoch 101) - Temporary Rejection if it may still be possible for the transaction payload to be committed. A given intent typically is only part of one submitted notarized payload, but it's possible for a notary to notarize and submit multiple payloads for the same intent. The Radix Engine ensures that any intent can only be committed once. A committed transaction is either committed with an outcome of "Success" or "Failure": - Committed Failure will result in fees being paid up until the failure point, but all other events and state changes will be discarded.
- Committed Success will result in all changes being committed, and fees being paid. Only committed transactions appear in the transaction stream - rejected transactions by definition never make it into the history of the ledger. Typically you will want to handle these in the following ways: - Temporary Rejection: You may wish to wait or resubmit the same transaction (with the same transaction intent / transaction identifier). - Do NOT rebuild the transaction - if you submit a newly built/signed transaction, both transactions could be committed. - Be careful: the transaction may still be able to be committed successfully! For example - if not enough XRD is available to lock a fee in the account, the transaction will be marked as a temporary rejection, because if the account is topped up, the transaction might still go through. - Eventually this transaction will be permanently rejected because the “max epoch” configured during transaction construction will have passed. - You may wish to tune the max epoch so that transactions permanently reject sooner. Each epoch lasts about 5 minutes. - Permanent Rejection or Committed Failure: The transaction at this stage cannot be committed successfully. You will need to remedy the reason for failure - e.g. the account doesn’t have enough XRD to pay for fees - and then build/sign a new, replacement transaction - which will have a new transaction identifier. To summarise: - You are always safe to resubmit the same transaction (with the same transaction intent / transaction identifier) - each transaction intent can only be committed once. - To prevent the risk of a duplicate commit, you should only rebuild and submit a replacement transaction if you’ve seen that the previous transaction was marked as either Committed Failure or Permanent Rejection. Transaction results A transaction results in various outputs, notably: - State updates to the current ledger state, including changes to resource balances.
- Resource balances live in vaults under accounts/components. - The LTS API automatically aggregates these balance changes under global accounts/components for you, to avoid you having to worry about separate vaults. - Emitted events The LTS sub-api of the Core API lets you query the balance changes in a transaction. There is no such thing as a user “transaction type” such as a “transfer” - all user transactions make use of a transaction manifest, and could - e.g. - call DeFi components. Instead - we encourage you to think about a transaction’s resulting balance changes. For example: - Any transaction which resulted in your account gaining balance should be interpreted as a deposit into your account. - A transaction which results in (only) a withdrawal of resource R (and XRD fee payment) from one account and a deposit of that resource into another account could be interpreted as a simple transfer of resource R between those accounts, which could be used to show a special display for the results of the transaction. Transaction handling If a notarized transaction is submitted successfully to a node, the transaction will live in that node’s mempool, and be gossiped around the network. Hopefully it will end up in a validator’s mempool and be included in a proposal, and eventually committed. Commit times are typically a few seconds if the network is uncongested. If a transaction is no longer valid, it will drop out of the node’s mempool and the node will temporarily cache that the transaction is rejected, allowing the rejection to be returned from the transaction status API and preventing it from being added back into its mempool for a time. If a transaction is submitted to a Gateway, the Gateway will attempt to resubmit the transaction to the network for a limited time.
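The retry rules above boil down to a simple decision: resubmit the same payload on a temporary rejection, and only build a replacement transaction once you have seen a permanent rejection or committed failure. A minimal sketch in Python, using illustrative status strings rather than the Core API's actual response schema:

```python
def next_action(status: str) -> str:
    """Decide how to handle a submitted transaction intent.

    The status strings here are illustrative stand-ins, not the Core API's
    actual enum values.
    """
    if status == "CommittedSuccess":
        # All changes committed; nothing more to do.
        return "done"
    if status == "TemporaryRejection":
        # The intent may still commit: resubmit the SAME notarized payload.
        # Never rebuild yet, or both transactions could end up committed.
        return "resubmit_same_payload"
    if status in ("PermanentRejection", "CommittedFailure"):
        # The intent can never commit successfully: remedy the cause and
        # build a NEW transaction, which will have a new txid.
        return "build_new_transaction"
    raise ValueError(f"unknown status: {status}")
```

The key safety property is that rebuilding is gated on a terminal status, so the "duplicate commit" risk described above never arises.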
The transaction status endpoints on the Core API are designed to give a very clear picture about the current status of a transaction intent, and the likelihood that the transaction will be able to be committed. Transaction expiry, nonce and cancellation When transactions are built, in their header we have: - Valid from epoch - from this epoch, the transaction will be valid - Valid before epoch - at this epoch, the transaction will no longer be valid, and instead be permanently rejected - Notary signature Epochs are approximately 5 minutes long. Typically: - The valid from epoch is set to the current epoch - The valid before epoch is set to N epochs above the current epoch - where N is small (e.g. N = 2 is the default in the LTS Toolkit). If N = 2 then the transaction will permanently reject approximately 5-10 minutes after construction. The transaction header also contains a nonce, which allows creating a new intent on the rare occasions where you wish to duplicate the same intent. This nonce is NOT like an Ethereum nonce and repeating the nonce will NOT cancel the previous transaction. In future, we will support transaction cancellation via a special transaction that the notary can sign. But this will not be needed for launch as we expect commit times to be short until the network is saturated. Transaction messages Warning Encrypted messages are not currently implemented in the official Radix wallet. These messages can either be: - UTF-8 string - Raw bytes - Encrypted UTF-8 string - Encrypted raw bytes Messages have no length limit besides transaction size - although large transactions will cost more. Encrypted messages can be encrypted for reading by multiple Ed25519 and/or Secp256k1 public keys. These keys may either be: - Included by the dApp in a wallet transaction request, or resolved from the metadata of a receiving account. - Resolved by the sender’s wallet from the target account’s metadata for transfer transactions built by the wallet.
Note - virtual account addresses only contain hashed public keys, which isn’t sufficient for encrypting messages. Instead an account owner will upload a public key as metadata against their account if they wish to receive encrypted messages. Transaction stream and state versions Committed transactions are assigned a “resultant state version” which starts at 1 for the first transaction on the Babylon ledger and effectively acts as an auto-incrementing primary key for the Babylon committed transaction stream (with no gaps). The Core and Gateway APIs include endpoints starting /stream - these endpoints operate over the ordered stream of committed transactions, by state version ascending - and let you query for ledger history, transaction-by-transaction. Note: Babylon state versions start from 1 again - so the Babylon transaction with state version 1321 is different from the Olympia transaction with state version 1321. These transaction histories should be stored separately. Transaction fees Fees are paid in XRD. XRD is stored in vaults, and during execution, a transaction must call “lock fee” against an XRD vault to reserve some XRD to pay the transaction fee. Transfer transactions constructed by the LTS Toolkit SimpleTransactionBuilder will include a lock fee instruction automatically against the sender’s account, so the sender’s account must contain some XRD before they can send transactions. Multiple vaults can lock a fee in a transaction - with the later vaults being used preferentially. This can allow dApp components to pay fees for the user - or for other accounts to pay fees on behalf of a different account (which would need a transaction with multiple signatures). The transaction is granted a mini “loan” at the start of the transaction, during which it must lock a fee from an XRD vault. A transaction which doesn’t repay its loan in time will be rejected. 
If a transaction runs out of locked fee after the loan has been repaid, the transaction is marked as a committed failure. All changes (apart from fee payments) are rolled back. The fee total comes from a number of places: - Execution cost: - Engine calls and WASM execution - Reading and writing state. - This includes instantiation of virtual accounts which are created for the first time in your transaction via a deposit. This means that a transaction which deposits to a new virtual account will pay a higher fee than a transaction which deposits to an existing account. - Signature verification - And many other places - Royalty cost: - Packages and Components can define royalties, which are paid to the package / component owner for using their code or component. - This enables package writers to get paid for their work, and allows components such as oracles to charge for providing a service. - Tips: - A transaction may provide a tip multiplier - which can be used by validators to prioritise transactions if there is network contention. - The tip is defined in the transaction’s header. For a full blog post on fees, see How Fees Work in Babylon (https://www.radixdlt.com/blog/how-fees-work-in-babylon) . ## Native Token - XRD URL: https://radix.wiki/developers/legacy-docs/reference/babylon-technical-concepts/concepts-native-token-xrd Updated: 2026-02-18 Summary: The “Radix” token is called “XRD” and is the native token of the Radix ledger. It behaves like any other resource on the network. Reference > Integrator Concepts > Native Token - XRD — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/concepts-native-token-xrd.md) The “Radix” token is called “XRD” and is the native token of the Radix ledger. It behaves like any other resource on the network. It is a fungible resource and there are 10^18 “attos” in 1 XRD.
In other words, it has the default/maximum divisibility for a fungible resource of 18: one subunit is 10^-18 units. XRD is used to pay transaction fees, and for staking to validators for delegated-proof-of-stake security for consensus. ## State Model - Advanced URL: https://radix.wiki/developers/legacy-docs/reference/babylon-technical-concepts/concepts-state-model-advanced Updated: 2026-02-18 Summary: This section describes the state model in some detail. This detail may not be necessary for all use cases. Reference > Integrator Concepts > State Model - Advanced — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/concepts-state-model-advanced.md) This section describes the state model in some detail. This detail may not be necessary for all use cases. The Babylon state model consists of a forest of state trees. - At the top of each tree is a global entity - such as an account, resource or package. - Entities contain modules - for example “Self” for their own state, and “Authorization” for their access rules. - These modules contain substates which actually store the state. - … And these substates can then own further internal entities, allowing for layers of recursion. For example: - An Account entity has a “Self” module which contains its “component data” substate. - This component data substate owns an internal key value store entity - which has a self module containing entry substates for each resource owned by the account. - These entry substates each then own a single internal vault entity for that resource which actually stores its relevant resource. In general, only global entities can be addressed directly, and internal state is mostly an implementation detail - although the state tree can be queried via the APIs.
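The account example above can be sketched as a nested structure. This is purely illustrative - placeholder addresses and plain Python dicts, not the engine's real types or substate encodings:

```python
# Illustrative sketch of the state-tree nesting described above:
# global entity -> modules -> substates -> owned internal entities.
account_tree = {
    "entity": "account_rdx1...",  # global Account entity (root of one tree)
    "modules": {
        "Self": {
            "substates": {
                "component_data": {
                    # the substate owns an internal key-value store entity
                    "owns": {
                        "entity": "internal_keyvaluestore_rdx1...",
                        "modules": {
                            "Self": {
                                "substates": {
                                    # one entry substate per resource held,
                                    # each owning a single internal vault
                                    "entry(resource_rdx1...xrd)": {
                                        "owns": {"entity": "internal_vault_rdx1..."}
                                    }
                                }
                            }
                        },
                    }
                }
            }
        },
        "Authorization": {"substates": {"access_rules": "..."}},
    },
}

# Walking from the global account down to the vault that stores the resource:
vault = (
    account_tree["modules"]["Self"]["substates"]["component_data"]["owns"]
    ["modules"]["Self"]["substates"]["entry(resource_rdx1...xrd)"]["owns"]["entity"]
)
print(vault)  # -> internal_vault_rdx1...
```

Note how only the root carries a directly-addressable global address, matching the point that internal entities are an implementation detail.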
## State Model - Introduction URL: https://radix.wiki/developers/legacy-docs/reference/babylon-technical-concepts/concepts-state-model-introduction Updated: 2026-02-18 Summary: Ledger data is stored in small versioned chunks called substates. Substates contain programmatic information which is encoded in a custom encoding called “SBOR” Reference > Integrator Concepts > State Model - Introduction — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/concepts-state-model-introduction.md) Ledger data is stored in small versioned chunks called substates. Substates contain programmatic information which is encoded in a custom encoding called “SBOR” (Scrypto binary object representation). This can be converted to/from JSON with the Radix Engine Toolkit. Substates are stored under objects called entities. Entities have addresses - for example, an Account has an address starting account_rdx1___. Entities which can be accessed directly are called global entities. Some entities are owned by other entities - these entities are called internal, and their addresses always start with the prefix internal_. Some important entities are as follows: Resources / Resource Managers Resource Managers are definitions of “resources” - which are the Radix ledger’s representation of assets. All resources have a global entity address starting resource_. Resources can be “fungible” or “non-fungible”. Fungible resources have a divisibility which by default is in units of 10^-18 but can be 10^-n for some other integer 0 <= n <= 18. Resource quantities are represented using Decimal numbers (fixed point), unlike Olympia and most other networks where whole numbers are always used. That is, if an account has 11.5 XRD in it, it will be seen as 11.5 XRD in the API, not as 11500000000000000000 XRD subunits.
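As a sketch of this fixed-point representation (the function name is ours, not part of the Radix Engine Toolkit), converting a raw subunit count into the Decimal amount that the APIs report:

```python
from decimal import Decimal

def subunits_to_amount(subunits: int, divisibility: int = 18) -> Decimal:
    """Convert a whole number of smallest units into the Decimal amount
    the APIs report (divisibility 18 means subunits of 10^-18)."""
    if not 0 <= divisibility <= 18:
        raise ValueError("divisibility must be in [0, 18]")
    return Decimal(subunits) / (Decimal(10) ** divisibility)

# 11500000000000000000 attos is reported as 11.5 XRD, not a raw subunit count.
print(subunits_to_amount(11_500_000_000_000_000_000))  # -> 11.5
```

A resource with a lower divisibility simply uses a smaller exponent, e.g. divisibility 2 makes the smallest unit 0.01.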
Non-fungible resources have a non-fungible id type - one of: - Number - displayed as #123# - String - displayed as <some_string> - Bytes - displayed in hex as [deadbeef] - UUID - displayed as {b36f5b3f-835b-406c-980f-7788d8f13c1b} Non-fungible resources also have a schema for data which is attached against each non-fungible id. Vaults Vaults store resources on ledger. They are always owned by another entity. They have an internal entity address starting internal_vault_. Accounts Accounts are used by users to store resources - and expose methods that allow resources to be deposited to them. Global accounts have an address starting account_. Whilst accounts can be created like any other component, most accounts start as a virtual account, associated with a public key, but not yet created on ledger. When a deposit is made (or they are otherwise first interacted with), a small fee is paid and the account is instantiated, with the public key as the controller of the account. Note that accounts can have their ownership rules updated by their owner. So even virtual accounts which start being controlled by the corresponding key pair may be changed to no longer be controlled by that key pair. In particular, accounts can turn into smart accounts ( see RadFi (https://www.radixdlt.com/radfi) ), by turning them into “account-owner-badge-controlled” and depositing that badge into an access controller - this allows complex multi-factor controls, and allows for the keys to be changed. Most accounts created by users will eventually end up as smart accounts. Packages and Components DApps have their code deployed in Packages. These packages can be instantiated as Components, and can be interacted with in the manifest just like accounts. All packages have a global entity address starting package_. In fact, Accounts actually are components - although they use a “native” package, built into the Radix Engine.
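Since entity kinds are distinguishable by their address prefixes (account_, resource_, package_, internal_vault_, internal_), a simple classifier can be sketched. The prefix-to-kind mapping below is illustrative and deliberately not exhaustive - real Babylon addresses have more prefixes than these:

```python
def classify_address(address: str) -> str:
    """Classify a Babylon entity address by its prefix (illustrative only).
    Longer prefixes are checked first so internal_vault_ is not swallowed
    by the generic internal_ case."""
    prefixes = [
        ("internal_vault_", "vault (internal entity)"),
        ("internal_", "other internal entity"),
        ("account_", "account (global entity)"),
        ("resource_", "resource manager (global entity)"),
        ("package_", "package (global entity)"),
    ]
    for prefix, kind in prefixes:
        if address.startswith(prefix):
            return kind
    return "unknown"

print(classify_address("account_rdx1..."))         # -> account (global entity)
print(classify_address("internal_vault_rdx1..."))  # -> vault (internal entity)
```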
## Curves, Keys, Signatures and Hashing URL: https://radix.wiki/developers/legacy-docs/reference/babylon-technical-concepts/concepts-curves-keys-signatures-and-hashing Updated: 2026-02-18 Summary: The Babylon Radix network supports ECDSA Secp256k1 and Ed25519 for accounts and transaction signing. Reference > Integrator Concepts > Curves, Keys, Signatures and Hashing — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/concepts-curves-keys-signatures-and-hashing.md) The Babylon Radix network supports ECDSA Secp256k1 and Ed25519 for accounts and transaction signing. The Babylon Radix network uses Blake2b-256 as its primary hashing mechanism - but integrators should not need to use Blake2b themselves for transaction signing. Instead, the (offline) Radix Engine Toolkit provides you with the hash to sign. If using Secp256k1: - Signatures should be serialized as recoverable signatures of 65 bytes, with the recovery byte first, as: v || r || s - There isn’t a de-facto convention for serialization of compact Secp256k1 signatures. - On Olympia, DER/ASN.1 was used - the above format for Babylon is different, and more compact. - Note that some libraries (such as libsecp256k1) have their own compact serialization, and a few serialize it as reverse(r) || reverse(s) || v - an example of the conversion of this format into the Radix format is here (https://gist.github.com/0xOmarA/01184ff98b155254392d277d753932ff) and we have some test vectors for optionally testing your serialization. - The public key is encoded as the standard 33-byte encoding for compressed Secp256k1 public keys (X coordinate and the sign byte) If using Ed25519: - Note that the message you sign will be the relevant transaction hash. As part of signing, Ed25519 will first use SHA-512 on this hash before signing with Curve25519. Typically this will happen implicitly, but some implementations require you to apply SHA-512 manually first.
- The signature is encoded as the standard 64-byte encoding for Ed25519 signatures. - The public key is encoded as the standard 32-byte encoding for Ed25519 public keys. We have some test vectors here for verification (https://gist.github.com/0xOmarA/afcf19a09cb400d26cf11dafc03d1c53) of public key and signature serialization. ## Consensus, Ledger Forks, Blocks, and Trust chains URL: https://radix.wiki/developers/legacy-docs/reference/babylon-technical-concepts/concepts-consensus-ledger-forks-blocks-and-trust-chains Updated: 2026-02-18 Summary: The Babylon Radix Network uses HotStuff BFT - with a decentralized validator set. This process has deterministic finality: commits are final and there are no probabilistic forks in the ledger. Reference > Integrator Concepts > Consensus, Ledger Forks, Blocks, and Trust chains — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/concepts-consensus-ledger-forks-blocks-and-trust-chains.md) The Babylon Radix Network uses HotStuff BFT - with a decentralized validator set. This process has deterministic finality: commits are final and there are no probabilistic forks in the ledger. Unlike most blockchains, the transaction stream is not separated into distinct blocks. Rather, the Babylon Radix ledger is a chain of transactions, broken into epochs. Each epoch lasts approximately 5 minutes. Transactions are given a “(resultant) state version” which starts at 1 and effectively acts as an auto-incrementing primary key for the committed transaction stream (with no gaps). If for your integration you need a block hash or block index, you should consider using a given transaction’s (resultant) state version instead. Consensus operates on a tree of vertices, but these vertex boundaries effectively disappear once committed. Instead, the ledger is a stream of committed transactions, with signed “proofs” pointing at the last transaction in each vertex.
A node does not need to keep all of these proofs - although it will definitely keep the proofs marking the end of each epoch. The validator set is fixed for each epoch, but may change at an epoch boundary. Starting from the genesis epoch proof, this chain of epoch proofs can be used to quickly verify who the current validator set is, and hence to verify current ledger proofs (assuming the validator set has always been trusted). The Babylon Radix Network includes a 2-tier Jellyfish Merkle Tree and epoch-based Transaction Payload and Transaction Receipt Merkle Trees, which can be used to verify the system state and transaction outcomes respectively. ## Infrastructure and APIs URL: https://radix.wiki/developers/legacy-docs/reference/babylon-technical-concepts/concepts-infrastructure-and-apis Updated: 2026-02-18 Summary: The Babylon Radix Network is formed of Babylon Nodes, connected into a Network. Reference > Integrator Concepts > Infrastructure and APIs — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/concepts-infrastructure-and-apis.md) Babylon Nodes The Babylon Radix Network is formed of Babylon Nodes (https://github.com/radixdlt/babylon-node) , connected into a Network. Nodes run as a Java service, wrapping a natively-compiled core, written in Rust. This core includes the Babylon Radix Engine, which runs transactions, and state ledger storage, in RocksDB. The node’s APIs are: - Core API - exposes transaction and state information from the node. - System API - exposes information about system health and connections. - Prometheus API - exposes metrics from the node in Prometheus format. Documentation on Babylon Network APIs can be found here.
(https://github.com/gguuttss/radix-docs/blob/master/integrate/network-apis/README.md) Exchanges For most exchanges, the LTS sub-section of a node's Core API (https://radix-babylon-core-api.redoc.ly/#tag/LTS) should be sufficient, and offers: - Transaction Submission - Transaction Status - Streaming of committed transaction outcomes, including fungible balance changes - Streaming of committed transaction outcomes, filtered to an individual account - Reading of the current account balance for one or all resources. Exchanges will need to run their own Babylon full node(s). Documentation on running a Babylon node may be found here (https://github.com/gguuttss/radix-docs/blob/master/run/running-infrastructure.md) . Network Gateway Wallets and Dashboards use the Gateway API (../../integrate/network-apis/README.md#gateway-api) , which offers richer APIs that allow indexing and lookup of historic ledger state. Running a Gateway is more expensive than running a Babylon full node. Documentation on running a Network Gateway can be found here (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/README.md) ## Well-known Addresses URL: https://radix.wiki/developers/legacy-docs/reference/babylon-technical-concepts/well-known-addresses Updated: 2026-02-18 Summary: Legacy documentation: Well-known Addresses Reference > Integrator Concepts > Well-known Addresses — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/well-known-addresses.md) Mainnet System Native Addresses { "network_name": "mainnet", "network_id": 1, "network_hrp_suffix": "rdx", "well_known_addresses": { "xrd": "resource_rdx1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxradxrd", "secp256k1_signature_virtual_badge": "resource_rdx1nfxxxxxxxxxxsecpsgxxxxxxxxx004638826440xxxxxxxxxsecpsg", "ed25519_signature_virtual_badge": "resource_rdx1nfxxxxxxxxxxed25sgxxxxxxxxx002236757237xxxxxxxxxed25sg", "system_transaction_badge":
"resource_rdx1nfxxxxxxxxxxsystxnxxxxxxxxx002683325037xxxxxxxxxsystxn", "package_of_direct_caller_virtual_badge": "resource_rdx1nfxxxxxxxxxxpkcllrxxxxxxxxx003652646977xxxxxxxxxpkcllr", "global_caller_virtual_badge": "resource_rdx1nfxxxxxxxxxxglcllrxxxxxxxxx002350006550xxxxxxxxxglcllr", "package_owner_badge": "resource_rdx1nfxxxxxxxxxxpkgwnrxxxxxxxxx002558553505xxxxxxxxxpkgwnr", "validator_owner_badge": "resource_rdx1nfxxxxxxxxxxvdrwnrxxxxxxxxx004365253834xxxxxxxxxvdrwnr", "account_owner_badge": "resource_rdx1nfxxxxxxxxxxaccwnrxxxxxxxxx006664022062xxxxxxxxxaccwnr", "identity_owner_badge": "resource_rdx1nfxxxxxxxxxxdntwnrxxxxxxxxx002876444928xxxxxxxxxdntwnr", "package_package": "package_rdx1pkgxxxxxxxxxpackgexxxxxxxxx000726633226xxxxxxxxxpackge", "resource_package": "package_rdx1pkgxxxxxxxxxresrcexxxxxxxxx000538436477xxxxxxxxxresrce", "account_package": "package_rdx1pkgxxxxxxxxxaccntxxxxxxxxxx000929625493xxxxxxxxxaccntx", "identity_package": "package_rdx1pkgxxxxxxxxxdntyxxxxxxxxxxx008560783089xxxxxxxxxdntyxx", "consensus_manager_package": "package_rdx1pkgxxxxxxxxxcnsmgrxxxxxxxxx000746305335xxxxxxxxxcnsmgr", "access_controller_package": "package_rdx1pkgxxxxxxxxxcntrlrxxxxxxxxx000648572295xxxxxxxxxcntrlr", "transaction_processor_package": "package_rdx1pkgxxxxxxxxxtxnpxrxxxxxxxxx002962227406xxxxxxxxxtxnpxr", "metadata_module_package": "package_rdx1pkgxxxxxxxxxmtdataxxxxxxxxx005246577269xxxxxxxxxmtdata", "royalty_module_package": "package_rdx1pkgxxxxxxxxxryaltyxxxxxxxxx003849573396xxxxxxxxxryalty", "role_assignment_module_package": "package_rdx1pkgxxxxxxxxxarulesxxxxxxxxx002304462983xxxxxxxxxarules", "genesis_helper_package": "package_rdx1pkgxxxxxxxxxgenssxxxxxxxxxx004372642773xxxxxxxxxgenssx", "faucet_package": "package_rdx1pkgxxxxxxxxxfaucetxxxxxxxxx000034355863xxxxxxxxxfaucet", "pool_package": "package_rdx1pkgxxxxxxxxxplxxxxxxxxxxxxx020379220524xxxxxxxxxplxxxx", "transaction_tracker_package": "package_rdx1pkgxxxxxxxxxtxtrakxxxxxxxxx000595975309xxxxxxxxxtxtrak", 
"locker_package": "package_rdx1pkgxxxxxxxxxlckerxxxxxxxxxx000208064247xxxxxxxxxlckerx", "test_utils_package": "package_rdx1phua8spmaxapwq56stduucrvztk92gxzjy9c98h0qemfjec03k9pjp", "consensus_manager": "consensusmanager_rdx1scxxxxxxxxxxcnsmgrxxxxxxxxx000999665565xxxxxxxxxcnsmgr", "genesis_helper": "component_rdx1cptxxxxxxxxxgenssxxxxxxxxxx000977302539xxxxxxxxxgenssx", "faucet": "component_rdx1cptxxxxxxxxxfaucetxxxxxxxxx000527798379xxxxxxxxxfaucet", "transaction_tracker": "transactiontracker_rdx1stxxxxxxxxxxtxtrakxxxxxxxxx006844685494xxxxxxxxxtxtrak" } } Stokenet System Native Addresses { "network_name": "stokenet", "network_id": 2, "network_hrp_suffix": "tdx_2_", "well_known_addresses": { "xrd": "resource_tdx_2_1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxtfd2jc", "faucet": "component_tdx_2_1cptxxxxxxxxxfaucetxxxxxxxxx000527798379xxxxxxxxxyulkzl", "secp256k1_signature_virtual_badge": "resource_tdx_2_1nfxxxxxxxxxxsecpsgxxxxxxxxx004638826440xxxxxxxxxcdcdpa", "ed25519_signature_virtual_badge": "resource_tdx_2_1nfxxxxxxxxxxed25sgxxxxxxxxx002236757237xxxxxxxxx3e2cpa", "system_transaction_badge": "resource_tdx_2_1nfxxxxxxxxxxsystxnxxxxxxxxx002683325037xxxxxxxxxcss8hx", "package_of_direct_caller_virtual_badge": "resource_tdx_2_1nfxxxxxxxxxxpkcllrxxxxxxxxx003652646977xxxxxxxxxfzcnwk", "global_caller_virtual_badge": "resource_tdx_2_1nfxxxxxxxxxxglcllrxxxxxxxxx002350006550xxxxxxxxxqtcnwk", "package_owner_badge": "resource_tdx_2_1nfxxxxxxxxxxpkgwnrxxxxxxxxx002558553505xxxxxxxxxfzgzzk", "validator_owner_badge": "resource_tdx_2_1nfxxxxxxxxxxvdrwnrxxxxxxxxx004365253834xxxxxxxxxyerzzk", "account_owner_badge": "resource_tdx_2_1nfxxxxxxxxxxaccwnrxxxxxxxxx006664022062xxxxxxxxx4vczzk", "identity_owner_badge": "resource_tdx_2_1nfxxxxxxxxxxdntwnrxxxxxxxxx002876444928xxxxxxxxx98tzzk", "package_package": "package_tdx_2_1pkgxxxxxxxxxpackgexxxxxxxxx000726633226xxxxxxxxxehawfs", "resource_package": "package_tdx_2_1pkgxxxxxxxxxresrcexxxxxxxxx000538436477xxxxxxxxxmn4mes", 
"account_package": "package_tdx_2_1pkgxxxxxxxxxaccntxxxxxxxxxx000929625493xxxxxxxxx9jat20", "identity_package": "package_tdx_2_1pkgxxxxxxxxxdntyxxxxxxxxxxx008560783089xxxxxxxxx4ewu80", "consensus_manager_package": "package_tdx_2_1pkgxxxxxxxxxcnsmgrxxxxxxxxx000746305335xxxxxxxxxqe4rf2", "access_controller_package": "package_tdx_2_1pkgxxxxxxxxxcntrlrxxxxxxxxx000648572295xxxxxxxxxqewm72", "transaction_processor_package": "package_tdx_2_1pkgxxxxxxxxxtxnpxrxxxxxxxxx002962227406xxxxxxxxxnvke82", "metadata_module_package": "package_tdx_2_1pkgxxxxxxxxxmtdataxxxxxxxxx005246577269xxxxxxxxxrpg925", "royalty_module_package": "package_tdx_2_1pkgxxxxxxxxxryaltyxxxxxxxxx003849573396xxxxxxxxxmwc82d", "role_assignment_module_package": "package_tdx_2_1pkgxxxxxxxxxarulesxxxxxxxxx002304462983xxxxxxxxx9fe8ce", "genesis_helper_package": "package_tdx_2_1pkgxxxxxxxxxgenssxxxxxxxxxx004372642773xxxxxxxxxsnkg30", "faucet_package": "package_tdx_2_1pkgxxxxxxxxxfaucetxxxxxxxxx000034355863xxxxxxxxx3heqcz", "pool_package": "package_tdx_2_1pkgxxxxxxxxxplxxxxxxxxxxxxx020379220524xxxxxxxxxe4r780", "transaction_tracker_package": "package_tdx_2_1pkgxxxxxxxxxtxtrakxxxxxxxxx000595975309xxxxxxxxxnvwmul", "locker_package": "package_tdx_2_1pkgxxxxxxxxxlckerxxxxxxxxxx000208064247xxxxxxxxx8jnpz0", "test_utils_package": "package_tdx_2_1phua8spmaxapwq56stduucrvztk92gxzjy9c98h0qemfjec0fuqeng", "consensus_manager": "consensusmanager_tdx_2_1scxxxxxxxxxxcnsmgrxxxxxxxxx000999665565xxxxxxxxxv6cg29", "genesis_helper": "component_tdx_2_1cptxxxxxxxxxgenssxxxxxxxxxx000977302539xxxxxxxxx9cs7tj", "transaction_tracker": "transactiontracker_tdx_2_1stxxxxxxxxxxtxtrakxxxxxxxxx006844685494xxxxxxxxxxzw7jp" } } resim Radix Engine Simulator { "network": "simulator", "network_id": 242, "network_hrp_suffix": "sim", "well_known_addresses": { "xrd": "resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3", "faucet": "component_sim1cptxxxxxxxxxfaucetxxxxxxxxx000527798379xxxxxxxxxhkrefh", 
"secp256k1_signature_virtual_badge": "resource_sim1nfxxxxxxxxxxsecpsgxxxxxxxxx004638826440xxxxxxxxxwj8qq5", "ed25519_signature_virtual_badge": "resource_sim1nfxxxxxxxxxxed25sgxxxxxxxxx002236757237xxxxxxxxx8x44q5", "system_transaction_badge": "resource_sim1nfxxxxxxxxxxsystxnxxxxxxxxx002683325037xxxxxxxxxw002k0", "package_of_direct_caller_virtual_badge": "resource_sim1nfxxxxxxxxxxpkcllrxxxxxxxxx003652646977xxxxxxxxxla870l", "global_caller_virtual_badge": "resource_sim1nfxxxxxxxxxxglcllrxxxxxxxxx002350006550xxxxxxxxxk5870l", "package_owner_badge": "resource_sim1nfxxxxxxxxxxpkgwnrxxxxxxxxx002558553505xxxxxxxxxlah0rl", "validator_owner_badge": "resource_sim1nfxxxxxxxxxxvdrwnrxxxxxxxxx004365253834xxxxxxxxxjxu0rl", "account_owner_badge": "resource_sim1nfxxxxxxxxxxaccwnrxxxxxxxxx006664022062xxxxxxxxxrn80rl", "identity_owner_badge": "resource_sim1nfxxxxxxxxxxdntwnrxxxxxxxxx002876444928xxxxxxxxxnc50rl", "package_package": "package_sim1pkgxxxxxxxxxpackgexxxxxxxxx000726633226xxxxxxxxxlk8hc9", "resource_package": "package_sim1pkgxxxxxxxxxresrcexxxxxxxxx000538436477xxxxxxxxxaj0zg9", "account_package": "package_sim1pkgxxxxxxxxxaccntxxxxxxxxxx000929625493xxxxxxxxxrn8jm6", "identity_package": "package_sim1pkgxxxxxxxxxdntyxxxxxxxxxxx008560783089xxxxxxxxxnc59k6", "consensus_manager_package": "package_sim1pkgxxxxxxxxxcnsmgrxxxxxxxxx000746305335xxxxxxxxxxc06cl", "access_controller_package": "package_sim1pkgxxxxxxxxxcntrlrxxxxxxxxx000648572295xxxxxxxxxxc5z0l", "transaction_processor_package": "package_sim1pkgxxxxxxxxxtxnpxrxxxxxxxxx002962227406xxxxxxxxx4dvqkl", "metadata_module_package": "package_sim1pkgxxxxxxxxxmtdataxxxxxxxxx005246577269xxxxxxxxx9qjump", "royalty_module_package": "package_sim1pkgxxxxxxxxxryaltyxxxxxxxxx003849573396xxxxxxxxxa0z7mc", "role_assignment_module_package": "package_sim1pkgxxxxxxxxxarulesxxxxxxxxx002304462983xxxxxxxxxrgr7fv", "genesis_helper_package": "package_sim1pkgxxxxxxxxxgenssxxxxxxxxxx004372642773xxxxxxxxxkjv3q6", "faucet_package": 
"package_sim1pkgxxxxxxxxxfaucetxxxxxxxxx000034355863xxxxxxxxxhkrefh", "pool_package": "package_sim1pkgxxxxxxxxxplxxxxxxxxxxxxx020379220524xxxxxxxxxl5e8k6", "transaction_tracker_package": "package_sim1pkgxxxxxxxxxtxtrakxxxxxxxxx000595975309xxxxxxxxx4d5zd2", "locker_package": "package_sim1pkgxxxxxxxxxlckerxxxxxxxxxx000208064247xxxxxxxxxpnfcn6", "test_utils_package": "package_sim1phua8spmaxapwq56stduucrvztk92gxzjy9c98h0qemfjec00a6qza", "consensus_manager": "consensusmanager_sim1scxxxxxxxxxxcnsmgrxxxxxxxxx000999665565xxxxxxxxxxc06cl", "genesis_helper": "component_sim1cptxxxxxxxxxgenssxxxxxxxxxx000977302539xxxxxxxxxkjv3q6", "transaction_tracker": "transactiontracker_sim1stxxxxxxxxxxtxtrakxxxxxxxxx006844685494xxxxxxxxx4d5zd2" } } ## Environments URL: https://radix.wiki/developers/legacy-docs/reference/babylon-technical-concepts/concepts-environments Updated: 2026-02-18 Summary: There are three main Radix environments. Each environment has a different network ID and each has a recognizable address format. Reference > Integrator Concepts > Environments — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/concepts-environments.md) There are three main Radix environments. Each environment has a different network ID and each has a recognizable address format. Mainnet Mainnet is the official Radix Public Network. It's Production. It has real assets, and its history goes back to the Radix Olympia Public Network which went live in July of 2021. Mainnet's network ID is 1, and addresses on Mainnet have _rdx immediately after the human-readable address type. E.g., account addresses on mainnet start with account_rdx.
Mainnet
- Network Id: 1
- Logical Name: mainnet
- Dashboard / Explorer: https://dashboard.radixdlt.com/
- Developer Console: https://console.radixdlt.com/
- Gateway: https://mainnet.radixdlt.com/swagger/
Native addresses - XRD: resource_rdx1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxradxrd - More are detailed in Well-known Addresses (well-known-addresses.md#mainnet-system-native-addresses) . Stokenet (Public Test Network) Stokenet is our primary public test network for doing Radix development, intended for integration and testing work. It typically runs the exact same protocol version as mainnet, though sometimes it also serves as a first deployment for planned releases. Stokenet is intended to be stable, but in exceptional circumstances may be wiped and reset. Anything deployed to Stokenet should be treated as for testing purposes only. Stokenet's network ID is 2, and addresses on Stokenet have _tdx_2_ immediately after the human-readable address type. E.g., resource addresses on Stokenet start with resource_tdx_2_.
Stokenet
- Network Id: 2
- Logical Name: stokenet
- Dashboard / Explorer: https://stokenet-dashboard.radixdlt.com/
- Developer Console: https://stokenet-console.radixdlt.com/
- Gateway: https://stokenet.radixdlt.com/swagger/ (https://babylon-stokenet-gateway.radixdlt.com/swagger)
- Docker compose node for Core API development: https://github.com/radixdlt/babylon-node/tree/main/testnet-node
Native addresses - XRD: resource_tdx_2_1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxtfd2jc - Faucet: component_tdx_2_1cptxxxxxxxxxfaucetxxxxxxxxx000527798379xxxxxxxxxyulkzl - More are detailed in Well-known Addresses (well-known-addresses.md#stokenet-system-native-addresses) Local Environment When using the Radix Engine Simulator (resim), all deployment and interactions take place on your local filesystem and local machine. You can reset your local environment at will. The simulator's network ID is 242. Addresses on the simulator have _sim immediately after the human-readable address type. E.g., package addresses on the simulator start with package_sim.
resim Native addresses - XRD: resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3 - Faucet: component_sim1cptxxxxxxxxxfaucetxxxxxxxxx000527798379xxxxxxxxxhkrefh - More are detailed in Well-known Addresses (well-known-addresses.md#resim-system-native-addresses) ## Addresses URL: https://radix.wiki/developers/legacy-docs/reference/babylon-technical-concepts/concepts-addresses Updated: 2026-02-18 Summary: Radix Engine addresses are Bech32m encoded, where they are made up of a Human Readable Part (HRP), separator, and a base32m encoded data part Reference > Integrator Concepts > Addresses — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/concepts-addresses.md) High Level Overview Radix Engine addresses are Bech32m (https://github.com/bitcoin/bips/blob/master/bip-0350.mediawiki) encoded, where they are made up of a Human Readable Part (HRP), separator, and a base32m encoded data part which includes a 6 character checksum of the HRP and data. The human readable part is made up of two specifiers that are separated by an underscore ("_"). These are: - Entity Specifier: Specifies the type of entity that this address is for. As an example, in the case of the address being used to address an account component, the entity specifier would be "account". This makes it clear at first glance what the address is meant to be used for. - Network Specifier: Specifies the network that the address is used for. This helps distinguish mainnet addresses from stokenet addresses, and so on, making it clear what network the address is used for. The use of the Bech32m addressing scheme comes with a number of benefits: - All addresses are now checksummed and therefore typographical errors can be detected and corrected. - The entity specifier makes it clear what the address is meant to represent.
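A rough illustration of pulling the two HRP specifiers out of an address in Python. This sketch skips Bech32m checksum validation entirely; use a real bech32 library for actual decoding:

```python
def split_address(address: str) -> tuple[str, str, str]:
    """Split an address into (entity specifier, network specifier, data part).

    Sketch only: the Bech32m separator is the last "1" in the string (the
    data charset excludes "1"), and the HRP is "<entity>_<network>".
    No checksum verification is performed.
    """
    hrp, sep, data = address.rpartition("1")
    if not sep:
        raise ValueError("not a bech32m-style address")
    entity, _, network = hrp.partition("_")
    return entity, network, data

entity, network, _ = split_address(
    "resource_rdx1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxradxrd"
)
print(entity, network)  # resource rdx
```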
The Radix Engine Toolkit can generate virtual account addresses from an Ed25519 or Secp256k1 public key (../../integrate/radix-engine-toolkit/radix-engine-toolkit-usage-guide/radix-engine-toolkit-derivation.md#virtual-account-address-from-a-public-key) , removing the need to implement Bech32m encoding/decoding. Encoded Data The data encoded in the Bech32m address is the byte representation of the address, which is an array of 30 bytes that is made up of two main parts: - Entity Byte: This is the first byte in the address and it defines the type of entity being addressed. - Address Bytes: These are the remaining 29 bytes in the array and they are the address of the entity. The supported entity types and their entity byte representation are given in entity_type.rs (https://github.com/radixdlt/radixdlt-scrypto/blob/main/radix-common/src/types/entity_type.rs) . The byte representation of addresses is a lower-level concept and is not something that you will need to use on a day-to-day basis. Entity Specifiers Entity specifiers are the first part of the Bech32m HRPs used in Scrypto; they are used to specify the type of entity being addressed in a human readable way. For example, account_ or resource_. Any entity specifier starting with internal_ is of a non-global "internal" entity, and can’t be interacted with directly, aside from very specific instances, such as vault recalls/freezes. A full list of entity specifiers is given in hrpset.rs (https://github.com/radixdlt/radixdlt-scrypto/blob/main/radix-common/src/address/hrpset.rs) . Network Specifiers The address HRP contains a network specifier which is used to specify which network the entity is on.
The relevant network specifiers are:
- Mainnet: rdx
- Local Simulator: sim
- Testnet with hex id n: tdx_n_
- Stokenet (main public testnet): tdx_2_
Display of Resource Addresses and Non-Fungibles Please see the Standards section for recommendations on Displaying Resource Addresses (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/ui-ux-standards/resource-address-display.md) and Displaying Non-Fungibles (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/non-fungible-standards/non-fungible-display.md) in UIs. ## Integrator Concepts URL: https://radix.wiki/developers/legacy-docs/reference/babylon-technical-concepts/babylon-technical-concepts Updated: 2026-02-18 Summary: Legacy documentation: Integrator Concepts Reference > Integrator Concepts — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/README.md) ## Displaying Resource Addresses URL: https://radix.wiki/developers/legacy-docs/reference/standards/ui-ux-standards/resource-address-display Updated: 2026-02-18 Summary: All types of assets issued on the Radix Network take the form of resources. This article covers the recommendations for how fungible and non-fungible resource addresses are displayed Reference > Standards > UI/UX Standards > Displaying Resource Addresses — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/ui-ux-standards/resource-address-display.md) All types of assets issued on the Radix Network take the form of resources. This article covers the recommendations for how fungible and non-fungible resource addresses are displayed to the user, in websites and front-end applications. You may also wish to read the specific guide on displaying individual non-fungible global ids (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/non-fungible-standards/non-fungible-display.md) .
Shortened display of Resource Addresses Resources come in two kinds: fungible and non-fungible. By their nature, they function somewhat differently, and they are addressed differently. Resources – whether fungibles (tokens) or non-fungibles (NFTs) – are defined by their resource manager. This is the thing that holds the behaviors, properties, and metadata that apply to all of the tokens and NFTs of that type. We can refer to a resource (or its resource manager) by its unique resource address, which is generated automatically by the system. A resource address looks something like this (see the addresses (addresses) article for full details): resource_rdx1t4upr78guuapv5ept7d7ptekk9mqhy605zgms33mcszen8l9fac8vf As with all entity addresses, when displaying them for users, developers should abbreviate them in a consistent way. The standard is to abbreviate using the first 4 and last 6 characters separated by an ellipsis (4…6). For example, a friendly list of resources for a user might show something like this: Dallas Mavericks Tickets (reso…fac8vf) But of course, when a user clicks a copy link for a resource address (even if abbreviated), the whole address should always be copied. Any resource, fungible or non-fungible, may have metadata set on its resource manager. Metadata is a key-value store, with keys always being strings, and values being one of a set number of types. Metadata is where the creator can specify things like the name and symbol of a resource, and have it displayed in a wallet or explorer. Ultimately anything goes for metadata, but a set of recommended metadata standards is provided, which the Radix Wallet adopts. Special Resource Classifications The Radix Wallet, Dashboard, and dApps may show other kinds of resources, like “badges” or “pool units”. These are just special classifications of fungible or non-fungible resources, and so they aren’t addressed any differently than the above.
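The 4…6 abbreviation convention described above is easy to sketch (the helper name is ours):

```python
def abbreviate_address(address: str) -> str:
    """First 4 and last 6 characters of an address, joined with an ellipsis."""
    return f"{address[:4]}…{address[-6:]}"

addr = "resource_rdx1t4upr78guuapv5ept7d7ptekk9mqhy605zgms33mcszen8l9fac8vf"
print(abbreviate_address(addr))  # reso…fac8vf
```

Remember that a copy action should still place the full, unabbreviated address on the clipboard.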
The wallet or dashboard will be able to identify them with special metadata (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/metadata-standards/metadata-for-wallet-display.md) , and the Gateway may make it easy to view these special classifications for convenience. ## UI/UX Standards URL: https://radix.wiki/developers/legacy-docs/reference/standards/ui-ux-standards/ui-ux-standards Updated: 2026-02-18 Summary: Legacy documentation: UI/UX Standards Reference > Standards > UI/UX Standards — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/ui-ux-standards/README.md) ## Displaying Non-fungible data URL: https://radix.wiki/developers/legacy-docs/reference/standards/non-fungible-standards/non-fungible-data-for-wallet-display Updated: 2026-02-18 Summary: To display a non-fungible, you will want to display its id (non-fungible-display.md), and possibly display its associated data. Reference > Standards > Non-fungible Standards > Displaying Non-fungible data — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/non-fungible-standards/non-fungible-data-for-wallet-display.md) To display a non-fungible, you will want to display its id (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/non-fungible-standards/non-fungible-display.md) , and possibly display its associated data. This article captures the rules the Wallet uses to display non-fungible data. Other clients may choose their own representations of non-fungible data, but we recommend the standardized interpretation of certain fields. What is Non-fungible data? Non-fungibles are transferred individually, and each holds its own data. This data is not the metadata that is set on resource managers, but it fills a similar purpose for individual non-fungibles. This data must fit with a data structure defined at initial creation of the resource.
For example, I might have a set of "Dallas Mavericks Tickets" NFTs where the data struct says that each individual ticket non-fungible must have a game_date and seat_number. Each separate non-fungible under this resource will potentially have a different value for game_date and seat_number. Non-fungible data is encoded as Scrypto SBOR (https://github.com/gguuttss/radix-docs/blob/master/reference/sbor-serialization/scrypto-sbor/README.md) , and the engine ensures it matches the schema of the Non-Fungible data type, defined on the Non-Fungible Resource. The engine also validates that the top layer is a Tuple type with named fields. This data structure is recursive, and can be displayed by parsing the annotated programmatic SBOR JSON (https://github.com/gguuttss/radix-docs/blob/master/reference/sbor-serialization/sbor-textual-representations/sbor-programmatic-json.md) from the Gateway API. Some top-level field names have standardized purposes and are discussed below. But the other fields can be displayed by mapping the returned SBOR JSON into a tree-based data view. Individual NFT unit non-fungible data Clients will typically use non-fungible data for display in a very similar way to metadata, and so we list some standards of usage in this document. Wherever non-fungible resource managers (which may be thought of as “collections” of NFTs for users) are shown in the wallet, the associated individual NFT units (non-fungibles) are shown grouped beneath the resource manager as the heading. Non-fungible data affects how each of these units is displayed under the heading. Non-fungible data fields (field - type - intended use):
- name - String - Simple name of this particular non-fungible unit as intended to be displayed, with capitals.
- description - String - Summarized description of this particular non-fungible unit as intended to be displayed, with capitals.
- key_image_url - Url - Location of the image to be associated with this non-fungible unit.
- [other arbitrary data fields] - Any - Additional user-facing structured data, or programmatic data associated with this non-fungible.
Name Shown as an additional identifier along with the NonFungibleLocalId (which is universally used as the unique identifier for the non-fungible by the wallet). For example: Mavs vs Lakers 12-25-2024 The name may be truncated after 32 characters. Description Not shown as an identifier for the NFT - only listed under the general detailed information for the non-fungible. For example: This NFT grants access to the American Airlines center for the 12-25-2024 matchup between… Intended to provide a short, simple description in the context of a "get info" style screen. As with other metadata fields, no formatting is supported. Line breaks, extra whitespace, and other types of formatting tags should not be used - they may either be shown explicitly as text or ignored. The description may be truncated after 256 characters. Key Image URL A non-fungible may often represent or be closely associated with an image - whether a unique piece of art, a profile picture, or simply an image that gives a greater sense of the purpose of the NFT. The image at the URL specified here, if present, will be used in the wallet as the primary visual representation of the non-fungible, displayed prominently as a user browses their assets. At preview level, images may be shown cropped. If wider than 16:9 (W:H), it will be cropped into the largest possible 16:9 rectangle. If taller than 1:1, it will be cropped into the largest possible 1:1 square. - A detail view will show it at full aspect ratio. - Supported types: JPG, PNG, SVG, GIF (including animated), WEBP (including animated) - May be ignored for file sizes above 10MB - Animated images: The total pixel area of all frames together must be less than 100,000,000. For example, if the animation resolution is 2000×2000, the maximum number of animation frames allowed is 25.
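The animated-image limit above is simple arithmetic (the constant and function names are ours):

```python
MAX_TOTAL_FRAME_AREA = 100_000_000  # total pixel area allowed across all frames

def max_animation_frames(width: int, height: int) -> int:
    """How many frames an animation of the given resolution may have."""
    return MAX_TOTAL_FRAME_AREA // (width * height)

print(max_animation_frames(2000, 2000))  # 25
```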
Also note that most decentralized data storage platforms offer a standard URL to individual files. This means that a non-fungible on Radix may be linked to decentralized image storage using this data field.

Future extensions

A later expansion of the metadata standard may include the ability to specify more, such as different images for different sizes and usage, explicit mutability and aspect ratio, and verifiable hashes. We start with a simple single URL field standard for now, which the Radix Wallet will always intend to support, and will expand the standard (and its adoption in the Radix Wallet) from developer community feedback. Some NFT creators may wish to set additional data fields pointing to richer visual data (like multiple sizes, video, or 3D data), or include verifying data like a hash of the image in question. In the first instance of the Radix Wallet, this information would be shown among the "arbitrary data fields" below, with key_image_url as the only field given special visual presentation. This wallet support may be expanded and improved as community standards emerge.

Additional fields

A non-fungible resource manager specifies the set of available data fields for all of its non-fungibles via the schema. In addition to the specially-handled fields above, the wallet will show any additional fields as key/value pairs if the values are of simple types like strings, integers, and addresses. More complex data types may not necessarily be shown by the wallet, and the wallet may truncate these fields as needed. For example:

- Section = G
- Seat number = 44
- Game date = 12-25-24

Not shown in this list is the NonFungibleLocalId, even though this piece of data may be set by the NFT's creator and will be shown prominently in the Radix Wallet. The NonFungibleLocalId is used to address a specific non-fungible, must be unique, and may be set by the NFT creator.
Because the creator may set it, it is a useful way to "brand" a given non-fungible. For example, the NonFungibleLocalId might be a string set to "mavs_lakers_122524_44G". For more details on how we recommend non-fungible IDs to be displayed, please see the non-fungible display (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/non-fungible-standards/non-fungible-display.md) article.

## Displaying Non-fungibles

URL: https://radix.wiki/developers/legacy-docs/reference/standards/non-fungible-standards/non-fungible-display
Updated: 2026-02-18

View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/non-fungible-standards/non-fungible-display.md)

This article covers the recommendations for how "non-fungibles" (individual units of non-fungible resources) are displayed to a user, and covers display of their global ID and their data. You may also wish to see the advice on displaying fungible and non-fungible resource addresses (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/ui-ux-standards/resource-address-display.md), for when you wish to display the address of a non-fungible resource without a specific local ID.

Non-fungible global ID "address" display

Non-fungible resources are special in that each NFT "unit" also has its own unique identity. We call these individual units non-fungibles. We can refer to a non-fungible by its unique non-fungible global ID. This ID has two parts:

- The resource address of the resource manager it belongs to
- A non-fungible local ID that is unique within that set of resources.
These two parts of the global ID are concatenated as a single string, separated by a colon, to make it easy for users to read and copy-paste them in a wallet, explorer, or exchange UI. When put together, a full non-fungible global ID (using a string-type non-fungible local ID) looks something like: resource_1qlq38wvrvh5m4kaz6etaac4389qtuycnp89atc8acdfi:<ticket_19206>

This is the full global ID used to look up a specific non-fungible in, say, the Radix Dashboard; the non-fungible local ID ticket_19206 alone is not enough. This is because, for example, there could be a different resource that also includes a ticket_19206. The local ID is only unique within the set of non-fungibles of that specific non-fungible resource.

Non-fungible local IDs are defined by the creator and are of one of a few supported types. All non-fungibles of a given non-fungible resource use the same type of local ID. Each type is indicated by a special local ID format:

- String ([A-Za-z0-9_]{1,64}): indicated by < >, e.g. <ticket_19206>
- Integer (any U64): indicated by # #, e.g. #203478274#
- Bytes (1 to 64 bytes, displayed in hexadecimal): indicated by [ ], e.g. [deadbeef]
- RUID (32 random bytes): indicated by { }, e.g. {0000000000000000-0000000000000000-0000000000000000-0000000000000000}, which can be displayed shortened as {0534…1422}

When a non-fungible is shown in the Radix Wallet, Dashboard or your dApps, it may often be shown as a member of the top-level resource (manager). In cases like this, it's acceptable to identify the non-fungible to the user by its non-fungible local ID alone. It may even be presented to the user as its "ID" for simplicity. But when the user clicks a copy link for an individual non-fungible's address, the whole global ID should be copied, combining the resource address and local ID as shown above. Unlike resource addresses, the non-fungible's local ID portion should not generally be abbreviated, since it may often be intended to be readable by users.
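The global ID format and the delimiter conventions for each local ID type can be sketched as a small parser. This is an illustrative sketch only; the function names are hypothetical and not part of any official Radix SDK.

```python
# Sketch: splitting a non-fungible global ID of the form
# "resource_address:local_id" and classifying the local ID by its
# delimiters, per the conventions described above.

def parse_global_id(global_id: str) -> tuple[str, str, str]:
    """Return (resource_address, local_id, local_id_type)."""
    resource_address, local_id = global_id.split(":", 1)
    delimiters = {
        ("<", ">"): "string",
        ("#", "#"): "integer",
        ("[", "]"): "bytes",
        ("{", "}"): "ruid",
    }
    for (open_ch, close_ch), kind in delimiters.items():
        if local_id.startswith(open_ch) and local_id.endswith(close_ch):
            return resource_address, local_id, kind
    raise ValueError(f"unrecognised local ID format: {local_id}")

def abbreviate_ruid(local_id: str) -> str:
    """Shorten a RUID local ID to the 4…4 display format."""
    inner = local_id[1:-1].replace("-", "")
    return "{" + inner[:4] + "…" + inner[-4:] + "}"
```

Per the display guidance, only RUID local IDs should be abbreviated this way; string, integer, and bytes local IDs are generally shown in full.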
An example display for a user might look like:

Dallas Mavericks Tickets (reso…8acdfi)
- 12/25/22 - Seat 13B (ticket_19206)
- 12/25/22 - Seat 13C (ticket_19207)

However, if it is necessary to show a non-fungible global ID on its own (without the context of the associated resource manager), it is acceptable to display the address in abbreviated form like this (although this would be an unusual case): Dallas Mavericks Tickets, 12/25/22 - Seat 13B (reso…8acdfi:<ticket_19206>)

The exception to showing local IDs in full is the RUID type of local ID. Since these are very long and not particularly human readable, it is acceptable to abbreviate them, using a 4…4 format. So for example, a non-fungible's global ID might be shown, abbreviating both the resource portion and the local ID portion, as follows: reso…8acdfi:{0534…1422}

Non-fungible data display

See displaying non-fungible data (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/non-fungible-standards/non-fungible-data-for-wallet-display.md).

## Non-fungible Standards

URL: https://radix.wiki/developers/legacy-docs/reference/standards/non-fungible-standards/non-fungible-standards
Updated: 2026-02-18

Summary: Legacy documentation: Non-fungible Standards

Reference > Standards > Non-fungible Standards — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/non-fungible-standards/README.md)

## Metadata for Verification

URL: https://radix.wiki/developers/legacy-docs/reference/standards/metadata-standards/metadata-for-verification
Updated: 2026-02-18

Summary: An important aspect of Radix metadata standards is the concept of a dApp Definition.
View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/metadata-standards/metadata-for-verification.md)

Introduction

An important aspect of Radix metadata standards is the concept of a dApp Definition. This is an account which acts as the unique on-ledger identifier for a decentralized application. The dApp definition provides information about the dApp, and is a hub for establishing trusted two-way links with other entities (resources, components and packages) and websites which form part of the dApp. This allows the Radix Wallet to give appropriate guidance to the user, for example by telling the user which dApps they are interacting with during transaction signing. There is a tool on the developer console (https://console.radixdlt.com/configure-metadata) to help with setting up and validating the entity metadata. See the dApp Definition Setup Guide (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/dapp-definition-setup.md) for more information. The resolved links are returned by the /state/entity/details endpoint (https://radix-babylon-gateway-api.redoc.ly/#operation/StateEntityDetails) on the Gateway API (https://github.com/gguuttss/radix-docs/blob/master/integrate/network-apis/README.md).

Overview

The dApp definition account forms the "hub" in a hub-and-spoke model. It acts as a container for metadata, and also for different types of links, discussed in more depth below.

- Direct Links are links between a dApp and an entity, configured at each end with metadata.
- Origin Links are links between a dApp definition and an https origin. On the dApp definition side it is configured by metadata, and on the origin side it is configured with a /.well-known/radix.json file hosted at the root.
- Blueprint Links are indirect links between a dApp definition and all components instantiated from an approved blueprint.
- Partner Links are links between two different dApp definition accounts, configured at each end with metadata.
- The Primary Locker is a link from a dApp definition to a single account locker (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/locker.md), used by the wallet to auto-discover claims for the user's accounts from dApps they have previously interacted with.

Metadata Standards for Verification

The following details the various fields of metadata that should be configured on the on-ledger entities that make up your dApp, including the dApp Definition "hub" and the various resource, component, and package "spokes".

dApp Definition Accounts

To start, the dApp Definition account is configured with informational metadata, according to the Metadata Standards for Wallet Display (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/metadata-standards/metadata-for-wallet-display.md):

- The name field should be set to a String
- The description field should be set to a String
- The icon_url field should be set to a Url
- The tags field should be set to a String[]

The following metadata entries should then be set for verification purposes:

account_type (string): "dapp definition". Indicates that the metadata of this account should be read as a dApp definition, including looking for the metadata below, rather than treated as a user account.

claimed_entities (Vec<Address>):
Addresses of related entities.
- Claims ownership of packages, components, and resources; confirmed by the entity's metadata pointing back to this dApp Definition address.
- claimed_entities beyond the first 100 set on this account may be ignored.

claimed_websites (Vec<Origin>): Origins (URLs without any path) of associated websites. Note that origin metadata values must not end with a /. A / at the end of a URL would cause an invalid origin error.
- Claims ownership of websites; confirmed by looking up an expected .well-known/radix.json file at the claimed website origin.
- claimed_websites beyond the first 10 set on this account may be ignored.

dapp_definitions (Vec<Address>):
dApp definition account addresses.
- Claims association with another dApp Definition; confirmed by that dApp Definition pointing back to this one in the same way. This is more of a "peer" association than "ownership".
- dapp_definitions beyond the first 100 set on this account may be ignored.

account_locker (Address): Primary Account Locker for the dApp.
- Registers the given account locker as the primary account locker for the dApp.
- The account locker should also be two-way linked to the dApp, using a claimed_entities entry on the dApp definition and a dapp_definition entry on the account locker.
- A correctly two-way-linked account_locker can be used by wallets to check for outstanding claims from dApps the user has previously interacted with.

Resources

Warning: Plural field name. Unlike components and packages, a resource is permitted to link to multiple dApp definitions, so it uses the plural metadata name dapp_definitions.

dapp_definitions (Vec<Address>):
dApp definition account address(es) that have claimed this as a related resource, to confirm the claim. Defines a direct link to the dApp if the resource's address is also present in the dApp Definition Account's claimed_entities metadata. Allows display of which dApp(s) this resource is associated with. dapp_definitions beyond the first 5 set on this resource may be ignored.

Packages

dapp_definition (Address): The dApp definition account address. Defines a direct link to the dApp if the package's address is also present in the dApp Definition Account's claimed_entities metadata.

enable_blueprint_linking (Vec<String>): The names of blueprints in this package for which all components instantiated from that blueprint should be treated as being part of the dApp. If the package is two-way linked to a dApp definition, then any components instantiated from the defined blueprint names are treated as being part of the dApp (unless they directly link to a different dApp). dApp owners should only configure this for blueprints where they wish all such components to be associated with their dApp, whether:
- their constructors have used function access rules (assign-function-accessrules) to protect the constructor, or
- the constructors are public; regardless, all such components can be treated as part of the dApp.

Components

Direct Link

A direct link can be defined between a dApp definition and a scrypto component or "native component (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/README.md)" such as a pool, account, validator or access controller.

dapp_definition (Address): The dApp definition account address. Defines a direct link to the dApp if the component's address is also present in the dApp Definition Account's claimed_entities metadata.
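The direct-link check above, together with the blueprint-linking behaviour enabled by enable_blueprint_linking, can be sketched as a small resolution function. The dict shapes and function name are illustrative assumptions, not a real API; an actual client would read this metadata via the Gateway API.

```python
# Sketch of how a client might resolve a component's dApp link: a direct
# link takes priority, falling back to a blueprint link via the
# component's package. Data shapes are illustrative only.

def resolve_dapp_link(component: dict, package: dict, dapp_definition: dict):
    """Return the dApp definition address this component resolves to, or None."""
    dapp_address = dapp_definition["address"]
    claimed = dapp_definition.get("claimed_entities", [])

    # Direct link: component metadata points at the dApp definition,
    # and the dApp definition claims the component back.
    if (component.get("dapp_definition") == dapp_address
            and component["address"] in claimed):
        return dapp_address

    # Blueprint link: the package is two-way linked to the dApp definition
    # and lists the component's blueprint in enable_blueprint_linking.
    if (package.get("dapp_definition") == dapp_address
            and package["address"] in claimed
            and component.get("blueprint") in package.get("enable_blueprint_linking", [])):
        return dapp_address

    return None
```

Putting the direct check first mirrors the priority order described for resolved links: a direct link, if present, wins over an inherited blueprint link.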
Blueprint Link

A component is treated as having a blueprint link to a dApp definition if both:
- Its package is two-way linked to the dApp definition
- Its package has the enable_blueprint_linking metadata entry set to include the component's blueprint name.

Resolved Link

The "resolved link" is the primary link shown for a dApp. It resolves as the direct link, if it exists; otherwise the blueprint link. The resolved link is used by the wallet to show the associated dApps during transaction signing.

Linking a website to a dApp Definition

The Radix Wallet requires that a dApp website have a correctly configured dApp Definition. In development, you can set your wallet to developer mode to disable these checks.

Linking a website origin back to a dApp Definition

To declare a website (or websites) as part of a dApp, the dApp Definition sets a claimed_websites metadata field (which should have a value of Vec<Origin>). If more than one website is specified, the first website listed should be the preferred entry point for users. A client like the Radix Wallet might link to this first website if the full listing of websites is not shown.

For websites, the developer must create a radix.json file and expose it according to the RFC 8615: Well-known URI standard (https://www.rfc-editor.org/rfc/rfc8615) at /.well-known/radix.json under their origin. When a request is made from an origin to the Radix Wallet on behalf of a given dApp definition, the Radix Wallet will read this file and check for a two-way link with the given dApp definition. For example, if a dApp Definition's metadata identified the origin https://test.topradixdevs.io, it would look for confirmation of the link in an entry at https://test.topradixdevs.io/.well-known/radix.json

Origins and www

A dApp Definition is connected to one or more "claimed_websites". From these, the origin (https://developer.mozilla.org/en-US/docs/Glossary/Origin) is extracted and used for the two-way verification below.
This follows the web security model of the same-origin policy (https://developer.mozilla.org/en-US/docs/Web/Security/Same-origin_policy), where security-related data is sandboxed by origin. Note in particular that https://www.my-cool-radix-dapp.com is a separate origin from https://my-cool-radix-dapp.com. If you wish to support users using both, you need to either:

- (a) Register both as claimed websites on your dApp Definition. The first in the list will be the preferred link which is displayed to users.
- (b) Choose a canonical origin (e.g. with the www.), and set up a redirect from the other origin. Only the canonical origin then needs to be registered.

Typically (b) is the better choice, as it fits better with the same-origin policy for other web security concerns, but (a) is also possible.

A radix.json file for a domain hosting 2 dApps would be structured like this:

{
  "dApps": [
    {
      "dAppDefinitionAddress": "account_rdx1qluj95c7hduuvrt5jctcj059qfavznygcssrgnuk0k6q3cktjy"
    },
    {
      "dAppDefinitionAddress": "account_rdx1q7u4e5vry33qyddxpm0fhqspl5ruym7zx9kj9h0d52qsf4jc9p"
    }
  ]
}

Note that while this means one domain can host multiple dApps, there is no enforcement of separation between these dApps; it is assumed that each domain owner is in control of all dApps hosted on that domain. Each dApp must be listed in the radix.json file at the root level of the domain to be considered valid, and it is up to each dApp to correctly declare which dApp Definition it is associated with under that domain.

Using this information on the website and on the dApp Definition, the Radix Wallet can verify that requests from websites are genuine. The Wallet will know each dApp by its dApp Definition address, and no website can falsely claim to be associated with a dApp Definition that it does not actually control.
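The two-way check described above can be sketched as a pure function over already-parsed inputs: the dApp definition's claimed_websites metadata on one side, and the origin's radix.json contents on the other. Fetching is omitted, and the function name is a hypothetical helper, not a wallet API.

```python
# Sketch of the two-way website <-> dApp definition verification: the
# dApp definition must claim the origin, and the origin's
# /.well-known/radix.json must declare the dApp definition back.

def website_link_is_verified(origin: str,
                             claimed_websites: list[str],
                             radix_json: dict,
                             dapp_definition_address: str) -> bool:
    """True only if both sides of the link point at each other."""
    if origin not in claimed_websites:
        return False  # the dApp definition does not claim this origin
    declared = {entry.get("dAppDefinitionAddress")
                for entry in radix_json.get("dApps", [])}
    return dapp_definition_address in declared
```

Because both checks must pass, neither a website nor a dApp definition can unilaterally assert the association.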
Gumball Club Example

- dApp URL is: https://gumball-club.radixdlt.com/ which gives an origin of https://gumball-club.radixdlt.com
- Well-known radix.json file: https://gumball-club.radixdlt.com/.well-known/radix.json
- Dashboard Examples: Current dApp Definition metadata (https://dashboard.radixdlt.com/account/account_rdx12xuhw6v30chdkhcu7qznz9vu926vxefr4h4tdvc0mdckg9rq4afx9t/metadata) | Transaction which set up the claimed_websites (https://dashboard.radixdlt.com/transaction/txid_rdx15klz9t3rcczq4r62p2xv0x83we8rmqf6wxd5g474jc2s28k64w6sqw8dzk/details)

## Metadata for Wallet Display

URL: https://radix.wiki/developers/legacy-docs/reference/standards/metadata-standards/metadata-for-wallet-display
Updated: 2026-02-18

View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/metadata-standards/metadata-for-wallet-display.md)

Introduction

These metadata standards are intended to assist clients like the Radix Wallet, Radix Dashboard, exchanges, and more, to display useful information about the assets a user holds and the on-ledger things a user interacts with, like dApps, components, and blueprints. Developers can feel free to add and use whatever metadata they like, but these standards are recommended as a "least common denominator" for the basic attributes that the Radix Wallet and any other client would expect to be present in a standard way. In short, if you set these things on the things you create, you can be sure of clear and excellent presentation for users.
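Several of the field formats specified in the resource standards that follow (symbols: alphanumeric, all caps, at most 5 characters; tags: lower-case alphanumeric with dashes for spaces, at most 16 characters) lend themselves to simple client-side checks. A minimal Python sketch, with hypothetical helper names:

```python
import re

# Sketch validators for the symbol and tag formats described in this
# article. Non-conforming values may simply be ignored by clients,
# per the standard. Helper names are illustrative, not an official API.

SYMBOL_RE = re.compile(r"[A-Z0-9]{1,5}")
TAG_RE = re.compile(r"[a-z0-9-]{1,16}")

def symbol_is_valid(symbol: str) -> bool:
    return SYMBOL_RE.fullmatch(symbol) is not None

def tag_is_valid(tag: str) -> bool:
    return TAG_RE.fullmatch(tag) is not None

def display_tag(tag: str) -> str:
    """Dashes in tags may be replaced by spaces for friendly display."""
    return tag.replace("-", " ")
```

Note that the standard asks clients to ignore, rather than reject, values that fail these formats.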
Resources

Metadata set on tokens and NFTs helps identify those assets and provides customized presentation to wallet users. Note that, for NFTs, this is metadata set on the NFT resource itself (the "resource manager") that refers to all individual non-fungibles belonging to that resource. Separate standards for setting data on individual non-fungibles are below.

name (string): Simple name of the asset as intended to be displayed, with capitals.
- Shown as the primary readable identifier for the token if symbol is not present (as for non-fungible tokens), and as a more complete human-readable name of the token.
- eg. Bitcoin
- eg. Dallas Mavericks Tickets
- May be truncated after 32 characters.

symbol (string, fungible resources only): A short unique identifier, often used on exchanges or to label an amount of the token (eg. 2.503 BTC).
- eg. BTC.
- No more than 5 characters; alphanumeric, all caps, no whitespace.
- Shown all-caps as the most-preferred singular method of identifying fungible tokens; the wallet will display the symbol all-caps.
- Symbol is ignored for non-fungible resources.
- Symbols not conforming to the intended string format may be ignored.
- May be truncated after 5 characters.

description (string): Summarized description of the asset and its purpose as intended to be displayed, with capitals.
- Not shown as a primary readable identifier for the token, but listed under the detailed information for the token.
- eg. Open source P2P money.
- eg. Your NFT tickets to see the Dallas Mavericks at the American Airlines Center.
- Intended to provide a short, simple description in the context of a "get info" style screen. As with other metadata fields, no formatting is supported. Line breaks, extra whitespace, and other types of formatting tags should not be used; they may either be shown explicitly as text or ignored.
- May be truncated after 256 characters.

tags (Vec<String>): List of descriptive tags for the resource.
- Alphanumeric, no caps, dashes for spaces, no whitespace.
- The wallet will allow the user to choose which tags are meaningful to them, display these tags on resources, and allow grouping by them.
- eg. badge, gaming, loan-positions, PFP
- The wallet will default to showing the badge tag (indicating an asset that is intended to be used for authorization). It will show a warning when the user is viewing a transaction that would transfer away a resource tagged badge.
- All other tags will be for the user to opt in to see flagged on their resources in a given account.
- Tags not conforming to the intended string format may be ignored.
- Dashes in tags may be replaced by spaces for friendly display for users.
- May be truncated after 16 characters.
- tags beyond the first 100 set on this resource may be ignored.

icon_url (URL): Location of the image to be used to represent the token.
- Should be designed for expected presentation as a circle.
- Shown as a primary visual identifier for the token in various places, if present. A placeholder is used if nothing is set.
- The wallet will load and scale this image for specific presentation, cropping it into a 1:1 circle.
- Supported types: JPG, PNG, GIF, WEBP, SVG
- Images without proper filename extensions, or of a format other than PNG, JPG, GIF, WEBP, or SVG, may be ignored.
- Note: A later expansion of the metadata standard may include the ability to specify more, such as different images for different sizes and usage, explicit mutability and aspect ratio, and verifiable hashes. We start with a simple single URL field standard for now, which the Radix Wallet will always intend to support, and will expand the standard (and its adoption in the Radix Wallet) from developer community feedback.
info_url (URL): Direct link to an informational webpage. If provided, the wallet will present this URL in the token's detailed page view, routing users to an informational webpage about the token.

Non-fungible Data Standards

Non-fungible data is not technically metadata, but is often displayed similarly. These are captured via the non-fungible data standards (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/non-fungible-standards/non-fungible-data-for-wallet-display.md).

Components and Blueprint Packages

Unlike most of the metadata standards, the fields here are not primarily intended for use in the wallet, but rather for use by network browsing and searching interfaces such as the Radix Dashboard, or perhaps future Scrypto code browsing and discovery services. They might be used along with other non-metadata information about the component or package, such as the methods available or the royalty fees charged for its usage.

name (string): Simple name of the component as intended to be displayed, with capitals.
- eg. AwesomeOracle Price/Weather
- May be truncated after 32 characters.

description (string): Summarized description of the component and its usage as intended to be displayed, with capitals.
- eg. Official AwesomeOracle component providing data about asset prices and weather conditions.
- Intended to provide a short, simple description in the context of a "get info" style screen. As with other metadata fields, no formatting is supported. Line breaks, extra whitespace, and other types of formatting tags should not be used; they may either be shown explicitly as text or ignored.
- May be truncated after 256 characters.

tags (Vec<String>): List of descriptive tags for the component.
- Alphanumeric, no caps, dashes for spaces, no whitespace.
- eg. oracle, price, weather
- Tags not conforming to the intended string format may be ignored.
- Dashes in tags may be replaced by spaces for friendly display for users.
- May be truncated after 16 characters.
- tags beyond the first 100 set on this component may be ignored.

Special Native Components and Resources

The Radix Network includes a selection of native blueprints from which users can instantiate native components with known and trustable behavior. Some of these components also mint their own associated resources. Below are the metadata standards a developer should consider to customize how these components and resources are presented in the Radix Wallet and other clients. Metadata standards for verification (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/metadata-standards/metadata-for-verification.md) may also be applied to these components and resources.

"Account" (for users) system component metadata

Accounts are the elemental holders of resources. The wallet interacts with them directly, or wraps them in an access controller component, which works by holding the owner badge issued by the account. No metadata is expected to be set on simple account components for users.

"Account" (for dApp definitions) system component metadata

In addition to user accounts in the wallet, accounts may be used as repositories for metadata that define the collection of things that make up a "dApp", so that they have a single on-ledger identity that they may be registered to, and recognized by, in the wallet. This registration is done by this component claiming ownership of other components, packages, resources, and more, and those entities in turn confirming that claim by setting metadata that points to this account's address. This is an account because this component may also need to hold resources of various types (in particular badges).

name (string): Simple name of the dApp as intended to be displayed, with capitals.
- e.g.
CollaboFi
- Intended to provide a short, simple name in the context of a "get info" style screen. As with other metadata fields, no formatting is supported. Line breaks, extra whitespace, and other types of formatting tags should not be used; they may either be shown explicitly as text or ignored.
- May be truncated after 32 characters.

description (string): Summarized description of the dApp as intended to be displayed, with capitals.
- eg. CollaboFi is a platform for collaborative community and creative crowdfunding.
- May be truncated after 256 characters.

tags (Vec<String>): List of descriptive tags for the dApp.
- Alphanumeric, no caps, dashes for spaces, no whitespace.
- eg. dao, marketplace, music, art
- Add clear, concise tags to improve the discoverability of your dApp in the Wallet's dApp Directory.
- May be truncated after 16 characters.
- tags beyond the first 100 set on this account may be ignored.

icon_url (URL): Location of the image to be used to represent the dApp.
- Should be designed for expected presentation as a square.
- Shown as a primary visual identifier for the dApp in various places, if present. A placeholder is used if nothing is set.
- The wallet will load and scale this image for specific presentation, cropping it into a 1:1 square.
- Supported types: JPG, PNG, GIF, WEBP, SVG
- Images without proper filename extensions, or of a format other than PNG, JPG, GIF, WEBP, or SVG, may be ignored.
- Note: A later expansion of the metadata standard may include the ability to specify more, such as different images for different sizes and usage, explicit mutability and aspect ratio, and verifiable hashes. We start with a simple single URL field standard for now, which the Radix Wallet will always intend to support, and will expand the standard (and its adoption in the Radix Wallet) from developer community feedback.

dapp_category (string): One of: defi, utility, dao, nft, meme.
- Used to categorize the dApp in the Wallet's dApp Directory.
- Optional field.
Use this to override the default category configured during the dApp's initial listing in the dApp Directory.

"Identity" system component metadata

This system component is specifically intended to be used as a repository for a public key that can be used for web3 login verification as part of the ROLA system. The Radix Wallet creates and associates an identity component with each Persona that a user creates there to be used for web3 logins. The Identity component holds no resources. The login mechanism is designed to be pseudo-anonymous, therefore there is little reason for general metadata to be stored here (and there is good reason not to store data there). Identities have no standard metadata fields.

"Validator" system component metadata

A validator node-runner will instantiate a system Validator component that holds its stake and allows its owner to define various pieces of metadata that might be visible on a staking dashboard or in the Radix Wallet. Other more functional aspects of Validator configuration, such as setting the validator's fee, are set via component methods, not metadata.

name (string): Simple name of the validator as intended to be displayed, with capitals.
- eg. Prime Stake
- May be truncated after 32 characters.

description (string): Summarized description of the validator as intended to be displayed, with capitals.
- eg. Your top choice for reliable, community-driven Radix Network validation.
- Intended to provide a short, simple description in the context of a "get info" style screen. As with other metadata fields, no formatting is supported. Line breaks, extra whitespace, and other types of formatting tags should not be used; they may either be shown explicitly as text or ignored.
- May be truncated after 256 characters.

icon_url (URL): Location of the image to be used to represent the validator.
- Should be designed for expected presentation as a square.
- Shown as a primary visual identifier for the validator in various places, if present. A placeholder is used if nothing is set.
- The wallet will load and scale this image for specific presentation, cropping it into a 1:1 square.
- Supported types: JPG, PNG, GIF, WEBP, SVG
- Images without proper filename extensions, or of a format other than PNG, JPG, GIF, WEBP, or SVG, may be ignored.
- Note: A later expansion of the metadata standard may include the ability to specify more, such as different images for different sizes and usage, explicit mutability and aspect ratio, and verifiable hashes. We start with a simple single URL field standard for now, which the Radix Wallet will always intend to support, and will expand the standard (and its adoption in the Radix Wallet) from developer community feedback.

info_url (URL): Direct link to an informational webpage. If provided, the wallet will present this URL in the validator's detailed page view, routing users to an informational webpage about the validator.

"Liquid Staking Unit" and "Stake Claim" resources for Validators

Validator components automatically mint liquid stake unit tokens and stake claim NFTs. It is not expected for the validator node-runner to set any metadata for these resources, and the Radix Wallet will show validator information taken from the validator component metadata as above.

"Pool" component metadata

A developer who wishes to use a pool will instantiate one from a native Pool blueprint (pool-component), and they will have the ability to set metadata on it as they wish. The metadata standards for pool components are the same as those for other components above.

"Pool Unit" resource metadata

Pool components automatically mint pool unit tokens. Instantiators of pool components will also be given the ability to set the metadata on this resource as they wish.
The metadata standards for these resources are the same as other resources above (metadata-for-wallet-display.md#resources) . ## Metadata Standards URL: https://radix.wiki/developers/legacy-docs/reference/standards/metadata-standards/metadata-standards Updated: 2026-02-18 Summary: Legacy documentation: Metadata Standards Reference > Standards > Metadata Standards — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/metadata-standards/README.md) ## Standards URL: https://radix.wiki/developers/legacy-docs/reference/standards/standards Updated: 2026-02-18 Summary: Legacy documentation: Standards Reference > Standards — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/README.md) ## Well-known Addresses URL: https://radix.wiki/developers/legacy-docs/reference/well-known-addresses-1 Updated: 2026-02-18 Summary: Legacy documentation: Well-known Addresses Reference > Well-known Addresses — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/well-known-addresses-1.md) Mainnet System Native Addresses { "network_name": "mainnet", "network_id": 1, "network_hrp_suffix": "rdx", "well_known_addresses": { "xrd": "resource_rdx1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxradxrd", "secp256k1_signature_virtual_badge": "resource_rdx1nfxxxxxxxxxxsecpsgxxxxxxxxx004638826440xxxxxxxxxsecpsg", "ed25519_signature_virtual_badge": "resource_rdx1nfxxxxxxxxxxed25sgxxxxxxxxx002236757237xxxxxxxxxed25sg", "system_transaction_badge": "resource_rdx1nfxxxxxxxxxxsystxnxxxxxxxxx002683325037xxxxxxxxxsystxn", "package_of_direct_caller_virtual_badge": "resource_rdx1nfxxxxxxxxxxpkcllrxxxxxxxxx003652646977xxxxxxxxxpkcllr", "global_caller_virtual_badge": "resource_rdx1nfxxxxxxxxxxglcllrxxxxxxxxx002350006550xxxxxxxxxglcllr", "package_owner_badge": "resource_rdx1nfxxxxxxxxxxpkgwnrxxxxxxxxx002558553505xxxxxxxxxpkgwnr", "validator_owner_badge": 
"resource_rdx1nfxxxxxxxxxxvdrwnrxxxxxxxxx004365253834xxxxxxxxxvdrwnr", "account_owner_badge": "resource_rdx1nfxxxxxxxxxxaccwnrxxxxxxxxx006664022062xxxxxxxxxaccwnr", "identity_owner_badge": "resource_rdx1nfxxxxxxxxxxdntwnrxxxxxxxxx002876444928xxxxxxxxxdntwnr", "package_package": "package_rdx1pkgxxxxxxxxxpackgexxxxxxxxx000726633226xxxxxxxxxpackge", "resource_package": "package_rdx1pkgxxxxxxxxxresrcexxxxxxxxx000538436477xxxxxxxxxresrce", "account_package": "package_rdx1pkgxxxxxxxxxaccntxxxxxxxxxx000929625493xxxxxxxxxaccntx", "identity_package": "package_rdx1pkgxxxxxxxxxdntyxxxxxxxxxxx008560783089xxxxxxxxxdntyxx", "consensus_manager_package": "package_rdx1pkgxxxxxxxxxcnsmgrxxxxxxxxx000746305335xxxxxxxxxcnsmgr", "access_controller_package": "package_rdx1pkgxxxxxxxxxcntrlrxxxxxxxxx000648572295xxxxxxxxxcntrlr", "transaction_processor_package": "package_rdx1pkgxxxxxxxxxtxnpxrxxxxxxxxx002962227406xxxxxxxxxtxnpxr", "metadata_module_package": "package_rdx1pkgxxxxxxxxxmtdataxxxxxxxxx005246577269xxxxxxxxxmtdata", "royalty_module_package": "package_rdx1pkgxxxxxxxxxryaltyxxxxxxxxx003849573396xxxxxxxxxryalty", "role_assignment_module_package": "package_rdx1pkgxxxxxxxxxarulesxxxxxxxxx002304462983xxxxxxxxxarules", "genesis_helper_package": "package_rdx1pkgxxxxxxxxxgenssxxxxxxxxxx004372642773xxxxxxxxxgenssx", "faucet_package": "package_rdx1pkgxxxxxxxxxfaucetxxxxxxxxx000034355863xxxxxxxxxfaucet", "pool_package": "package_rdx1pkgxxxxxxxxxplxxxxxxxxxxxxx020379220524xxxxxxxxxplxxxx", "transaction_tracker_package": "package_rdx1pkgxxxxxxxxxtxtrakxxxxxxxxx000595975309xxxxxxxxxtxtrak", "locker_package": "package_rdx1pkgxxxxxxxxxlckerxxxxxxxxxx000208064247xxxxxxxxxlckerx", "test_utils_package": "package_rdx1phua8spmaxapwq56stduucrvztk92gxzjy9c98h0qemfjec03k9pjp", "consensus_manager": "consensusmanager_rdx1scxxxxxxxxxxcnsmgrxxxxxxxxx000999665565xxxxxxxxxcnsmgr", "genesis_helper": "component_rdx1cptxxxxxxxxxgenssxxxxxxxxxx000977302539xxxxxxxxxgenssx", "faucet": 
"component_rdx1cptxxxxxxxxxfaucetxxxxxxxxx000527798379xxxxxxxxxfaucet", "transaction_tracker": "transactiontracker_rdx1stxxxxxxxxxxtxtrakxxxxxxxxx006844685494xxxxxxxxxtxtrak" } } Stokenet System Native Addresses { "network_name": "stokenet", "network_id": 2, "network_hrp_suffix": "tdx_2_", "well_known_addresses": { "xrd": "resource_tdx_2_1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxtfd2jc", "faucet": "component_tdx_2_1cptxxxxxxxxxfaucetxxxxxxxxx000527798379xxxxxxxxxyulkzl", "secp256k1_signature_virtual_badge": "resource_tdx_2_1nfxxxxxxxxxxsecpsgxxxxxxxxx004638826440xxxxxxxxxcdcdpa", "ed25519_signature_virtual_badge": "resource_tdx_2_1nfxxxxxxxxxxed25sgxxxxxxxxx002236757237xxxxxxxxx3e2cpa", "system_transaction_badge": "resource_tdx_2_1nfxxxxxxxxxxsystxnxxxxxxxxx002683325037xxxxxxxxxcss8hx", "package_of_direct_caller_virtual_badge": "resource_tdx_2_1nfxxxxxxxxxxpkcllrxxxxxxxxx003652646977xxxxxxxxxfzcnwk", "global_caller_virtual_badge": "resource_tdx_2_1nfxxxxxxxxxxglcllrxxxxxxxxx002350006550xxxxxxxxxqtcnwk", "package_owner_badge": "resource_tdx_2_1nfxxxxxxxxxxpkgwnrxxxxxxxxx002558553505xxxxxxxxxfzgzzk", "validator_owner_badge": "resource_tdx_2_1nfxxxxxxxxxxvdrwnrxxxxxxxxx004365253834xxxxxxxxxyerzzk", "account_owner_badge": "resource_tdx_2_1nfxxxxxxxxxxaccwnrxxxxxxxxx006664022062xxxxxxxxx4vczzk", "identity_owner_badge": "resource_tdx_2_1nfxxxxxxxxxxdntwnrxxxxxxxxx002876444928xxxxxxxxx98tzzk", "package_package": "package_tdx_2_1pkgxxxxxxxxxpackgexxxxxxxxx000726633226xxxxxxxxxehawfs", "resource_package": "package_tdx_2_1pkgxxxxxxxxxresrcexxxxxxxxx000538436477xxxxxxxxxmn4mes", "account_package": "package_tdx_2_1pkgxxxxxxxxxaccntxxxxxxxxxx000929625493xxxxxxxxx9jat20", "identity_package": "package_tdx_2_1pkgxxxxxxxxxdntyxxxxxxxxxxx008560783089xxxxxxxxx4ewu80", "consensus_manager_package": "package_tdx_2_1pkgxxxxxxxxxcnsmgrxxxxxxxxx000746305335xxxxxxxxxqe4rf2", "access_controller_package": "package_tdx_2_1pkgxxxxxxxxxcntrlrxxxxxxxxx000648572295xxxxxxxxxqewm72", 
"transaction_processor_package": "package_tdx_2_1pkgxxxxxxxxxtxnpxrxxxxxxxxx002962227406xxxxxxxxxnvke82", "metadata_module_package": "package_tdx_2_1pkgxxxxxxxxxmtdataxxxxxxxxx005246577269xxxxxxxxxrpg925", "royalty_module_package": "package_tdx_2_1pkgxxxxxxxxxryaltyxxxxxxxxx003849573396xxxxxxxxxmwc82d", "role_assignment_module_package": "package_tdx_2_1pkgxxxxxxxxxarulesxxxxxxxxx002304462983xxxxxxxxx9fe8ce", "genesis_helper_package": "package_tdx_2_1pkgxxxxxxxxxgenssxxxxxxxxxx004372642773xxxxxxxxxsnkg30", "faucet_package": "package_tdx_2_1pkgxxxxxxxxxfaucetxxxxxxxxx000034355863xxxxxxxxx3heqcz", "pool_package": "package_tdx_2_1pkgxxxxxxxxxplxxxxxxxxxxxxx020379220524xxxxxxxxxe4r780", "transaction_tracker_package": "package_tdx_2_1pkgxxxxxxxxxtxtrakxxxxxxxxx000595975309xxxxxxxxxnvwmul", "locker_package": "package_tdx_2_1pkgxxxxxxxxxlckerxxxxxxxxxx000208064247xxxxxxxxx8jnpz0", "test_utils_package": "package_tdx_2_1phua8spmaxapwq56stduucrvztk92gxzjy9c98h0qemfjec0fuqeng", "consensus_manager": "consensusmanager_tdx_2_1scxxxxxxxxxxcnsmgrxxxxxxxxx000999665565xxxxxxxxxv6cg29", "genesis_helper": "component_tdx_2_1cptxxxxxxxxxgenssxxxxxxxxxx000977302539xxxxxxxxx9cs7tj", "transaction_tracker": "transactiontracker_tdx_2_1stxxxxxxxxxxtxtrakxxxxxxxxx006844685494xxxxxxxxxxzw7jp" } } resim Radix Engine Simulator { "network": "simulator", "network_id": 242, "network_hrp_suffix": "sim", "well_known_addresses": { "xrd": "resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3", "faucet": "component_sim1cptxxxxxxxxxfaucetxxxxxxxxx000527798379xxxxxxxxxhkrefh", "secp256k1_signature_virtual_badge": "resource_sim1nfxxxxxxxxxxsecpsgxxxxxxxxx004638826440xxxxxxxxxwj8qq5", "ed25519_signature_virtual_badge": "resource_sim1nfxxxxxxxxxxed25sgxxxxxxxxx002236757237xxxxxxxxx8x44q5", "system_transaction_badge": "resource_sim1nfxxxxxxxxxxsystxnxxxxxxxxx002683325037xxxxxxxxxw002k0", "package_of_direct_caller_virtual_badge": 
"resource_sim1nfxxxxxxxxxxpkcllrxxxxxxxxx003652646977xxxxxxxxxla870l", "global_caller_virtual_badge": "resource_sim1nfxxxxxxxxxxglcllrxxxxxxxxx002350006550xxxxxxxxxk5870l", "package_owner_badge": "resource_sim1nfxxxxxxxxxxpkgwnrxxxxxxxxx002558553505xxxxxxxxxlah0rl", "validator_owner_badge": "resource_sim1nfxxxxxxxxxxvdrwnrxxxxxxxxx004365253834xxxxxxxxxjxu0rl", "account_owner_badge": "resource_sim1nfxxxxxxxxxxaccwnrxxxxxxxxx006664022062xxxxxxxxxrn80rl", "identity_owner_badge": "resource_sim1nfxxxxxxxxxxdntwnrxxxxxxxxx002876444928xxxxxxxxxnc50rl", "package_package": "package_sim1pkgxxxxxxxxxpackgexxxxxxxxx000726633226xxxxxxxxxlk8hc9", "resource_package": "package_sim1pkgxxxxxxxxxresrcexxxxxxxxx000538436477xxxxxxxxxaj0zg9", "account_package": "package_sim1pkgxxxxxxxxxaccntxxxxxxxxxx000929625493xxxxxxxxxrn8jm6", "identity_package": "package_sim1pkgxxxxxxxxxdntyxxxxxxxxxxx008560783089xxxxxxxxxnc59k6", "consensus_manager_package": "package_sim1pkgxxxxxxxxxcnsmgrxxxxxxxxx000746305335xxxxxxxxxxc06cl", "access_controller_package": "package_sim1pkgxxxxxxxxxcntrlrxxxxxxxxx000648572295xxxxxxxxxxc5z0l", "transaction_processor_package": "package_sim1pkgxxxxxxxxxtxnpxrxxxxxxxxx002962227406xxxxxxxxx4dvqkl", "metadata_module_package": "package_sim1pkgxxxxxxxxxmtdataxxxxxxxxx005246577269xxxxxxxxx9qjump", "royalty_module_package": "package_sim1pkgxxxxxxxxxryaltyxxxxxxxxx003849573396xxxxxxxxxa0z7mc", "role_assignment_module_package": "package_sim1pkgxxxxxxxxxarulesxxxxxxxxx002304462983xxxxxxxxxrgr7fv", "genesis_helper_package": "package_sim1pkgxxxxxxxxxgenssxxxxxxxxxx004372642773xxxxxxxxxkjv3q6", "faucet_package": "package_sim1pkgxxxxxxxxxfaucetxxxxxxxxx000034355863xxxxxxxxxhkrefh", "pool_package": "package_sim1pkgxxxxxxxxxplxxxxxxxxxxxxx020379220524xxxxxxxxxl5e8k6", "transaction_tracker_package": "package_sim1pkgxxxxxxxxxtxtrakxxxxxxxxx000595975309xxxxxxxxx4d5zd2", "locker_package": "package_sim1pkgxxxxxxxxxlckerxxxxxxxxxx000208064247xxxxxxxxxpnfcn6", "test_utils_package": 
"package_sim1phua8spmaxapwq56stduucrvztk92gxzjy9c98h0qemfjec00a6qza", "consensus_manager": "consensusmanager_sim1scxxxxxxxxxxcnsmgrxxxxxxxxx000999665565xxxxxxxxxxc06cl", "genesis_helper": "component_sim1cptxxxxxxxxxgenssxxxxxxxxxx000977302539xxxxxxxxxkjv3q6", "transaction_tracker": "transactiontracker_sim1stxxxxxxxxxxtxtrakxxxxxxxxx006844685494xxxxxxxxx4d5zd2" } }

## Address Description
URL: https://radix.wiki/developers/legacy-docs/reference/address-description Updated: 2026-02-18 Summary: Radix Engine addresses are Bech32m encoded, made up of a Human Readable Part (HRP), a separator, and a base32-encoded data part. Reference > Address Description — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/address-description.md)

High Level Overview

Radix Engine addresses are Bech32m (https://github.com/bitcoin/bips/blob/master/bip-0350.mediawiki) encoded: they are made up of a Human Readable Part (HRP), a separator, and a base32-encoded data part which includes a 6-character checksum of the HRP and data. The human readable part is made up of two specifiers that are separated by an underscore ("_"). These are:
- Entity Specifier: Specifies the type of entity that this address is for. For example, in the case of an address used to address an account component, the entity specifier would be "account". This makes it clear at first glance what the address is meant to be used for.
- Network Specifier: Specifies the network that the address is used for. This helps distinguish mainnet addresses from stokenet addresses, and so on, making it clear what network the address is used for.

The use of the Bech32m addressing scheme comes with a number of benefits:
- All addresses are checksummed, and therefore typographical errors can be detected.
- The entity specifier makes it clear what the address is meant to represent.
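The HRP layout just described can be pulled apart with ordinary string handling. Below is a minimal Python sketch, for illustration only: it does not validate the Bech32m checksum, and the helper name is our own. It relies on the fact that the Bech32m data-part charset excludes the character '1', so the last '1' in the address is the separator.

```python
def split_address_hrp(address: str) -> tuple[str, str]:
    """Split a Radix address into (entity specifier, network specifier).

    The HRP is everything before the Bech32m separator '1'; the data part's
    base32 charset never contains '1', so the last '1' marks the separator.
    """
    hrp = address[: address.rindex("1")]
    # The entity specifier is the first underscore-delimited segment;
    # the remainder (e.g. "rdx" or "tdx_2_") is the network specifier.
    entity, network = hrp.split("_", 1)
    return entity, network
```

For example, applied to the mainnet XRD address this returns `("resource", "rdx")`, and applied to a Stokenet resource address it returns `("resource", "tdx_2_")`.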
The Radix Engine Toolkit can generate virtual account addresses from an Ed25519 or Secp256k1 public key (../integrate/radix-engine-toolkit/radix-engine-toolkit-usage-guide/radix-engine-toolkit-derivation.md#virtual-account-address-from-a-public-key), removing the need to implement Bech32m encoding/decoding.

Encoded Data

The data encoded in the Bech32m address is the byte representation of the address: an array of 30 bytes made up of two main parts:
- Entity Byte: The first byte in the address; it defines the type of entity being addressed.
- Address Bytes: The remaining 29 bytes in the array; they are the address of the entity.

The supported entity types and their entity byte representation are given in entity_type.rs (https://github.com/radixdlt/radixdlt-scrypto/blob/main/radix-common/src/types/entity_type.rs). The byte representation of addresses is a lower-level concept and is not something that you will need to use on a day-to-day basis.

Entity Specifiers

Entity specifiers are the first part of the Bech32m HRPs used in Scrypto; they specify the type of entity being addressed in a human-readable way. For example, account_ or resource_. Any entity specifier starting with internal_ denotes a non-global "internal" entity, which can’t be interacted with directly, aside from very specific instances such as vault recalls/freezes. A full list of entity specifiers is given in hrpset.rs (https://github.com/radixdlt/radixdlt-scrypto/blob/main/radix-common/src/address/hrpset.rs).

Network Specifiers

The address HRP contains a network specifier which is used to specify which network the entity is on.
The table below contains all of the relevant network specifiers:

| Network Name | Network Specifier |
| --- | --- |
| Mainnet | rdx |
| Local Simulator | sim |
| Testnet with hex id n | tdx_n_ |
| Stokenet (main public testnet) | tdx_2_ |

Display of Resource Addresses and Non-Fungibles

Please see the Standards section for recommendations on Displaying Resource Addresses (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/ui-ux-standards/resource-address-display.md) and Displaying Non-Fungibles (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/non-fungible-standards/non-fungible-display.md) in UIs.

## Environments
URL: https://radix.wiki/developers/legacy-docs/reference/environments Updated: 2026-02-18 Summary: There are three main Radix environments. Each environment has a different network ID and each has a recognizable address format. Reference > Environments — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/environments.md)

There are three main Radix environments. Each environment has a different network ID and each has a recognizable address format.

Mainnet

Mainnet is the official Radix Public Network. It's production. It has real assets, and its history goes back to the Radix Olympia Public Network, which went live in July of 2021. Mainnet's network ID is 1, and addresses on Mainnet have _rdx immediately after the human-readable address type. E.g., account addresses on mainnet start with account_rdx.
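The network-specifier table above maps mechanically from network IDs, following the tdx_n_ hex-id pattern for testnets. A small Python sketch (the helper name is our own) makes the mapping concrete:

```python
def network_hrp_suffix(network_id: int) -> str:
    """Map a network ID to its address HRP network specifier."""
    if network_id == 1:
        return "rdx"  # Mainnet
    if network_id == 242:
        return "sim"  # Local simulator (resim)
    # Testnet with hex id n -> tdx_n_  (Stokenet has id 2, giving tdx_2_)
    return f"tdx_{network_id:x}_"
```

For instance, `network_hrp_suffix(2)` yields `"tdx_2_"`, matching the Stokenet addresses shown in this section.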
Mainnet

- Network Id: 1
- Logical Name: mainnet
- Dashboard / Explorer: https://dashboard.radixdlt.com/
- Developer Console: https://console.radixdlt.com/
- Gateway: https://mainnet.radixdlt.com/swagger/
- Native addresses:
  - XRD: resource_rdx1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxradxrd
  - More are detailed in Well-known Addresses (babylon-technical-concepts/well-known-addresses.md#mainnet-system-native-addresses).

Stokenet (Public Test Network)

Stokenet is the primary test network for doing Radix development. It typically runs the exact same protocol version as mainnet, though sometimes it also serves as a first deployment for planned releases. Stokenet is intended to be stable, but in exceptional circumstances may be wiped and reset. Anything deployed to Stokenet should be treated as for testing purposes only. Stokenet's network ID is 2, and addresses on Stokenet have _tdx_2_ immediately after the human-readable address type. E.g., resource addresses on Stokenet start with resource_tdx_2_. Stokenet is our primary public test network, intended for integration and testing work.
Stokenet

- Network Id: 2
- Logical Name: stokenet
- Dashboard / Explorer: https://stokenet-dashboard.radixdlt.com/
- Developer Console: https://stokenet-console.radixdlt.com/
- Gateway: https://stokenet.radixdlt.com/swagger/ (https://babylon-stokenet-gateway.radixdlt.com/swagger)
- Docker compose node for Core API development: https://github.com/radixdlt/babylon-node/tree/main/testnet-node
- Native addresses:
  - XRD: resource_tdx_2_1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxtfd2jc
  - Faucet: component_tdx_2_1cptxxxxxxxxxfaucetxxxxxxxxx000527798379xxxxxxxxxyulkzl
  - More are detailed in Well-known Addresses (babylon-technical-concepts/well-known-addresses.md#stokenet-system-native-addresses)

Local Environment

When using the Radix Engine Simulator (resim), all deployment and interactions take place on your local filesystem and local machine. You can reset your local environment at will. The simulator's network ID is 242. Addresses on the simulator have _sim immediately after the human-readable address type. E.g., package addresses on the simulator start with package_sim.
resim

- Native addresses:
  - XRD: resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3
  - Faucet: component_sim1cptxxxxxxxxxfaucetxxxxxxxxx000527798379xxxxxxxxxhkrefh
  - More are detailed in Well-known Addresses (babylon-technical-concepts/well-known-addresses.md#resim-system-native-addresses)

## Reference
URL: https://radix.wiki/developers/legacy-docs/reference/reference Updated: 2026-02-18 Summary: Legacy documentation: Reference Reference — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/reference/README.md)

## Releasing
URL: https://radix.wiki/developers/legacy-docs/run/network-gateway/maintenance-1/releasing Updated: 2026-02-18 Summary: When updating the Network Gateway, we may advise that specific steps are necessary. These steps are documented below. Run > Network Gateway > Maintenance > Releasing — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/maintenance-1/releasing.md)

Network Gateway Releasing

When updating the Network Gateway, we may advise that specific steps are necessary. These steps are documented below.

Deploy services in order

It is important to deploy new release services in the correct order:
- Execute Database Migrations. This process completes with a zero exit code, indicating that all migrations, if any, have been executed successfully. If a non-zero exit code is returned, the deployment procedure must be aborted.
- Deploy the Data Aggregator and wait for a healthy response from the health check endpoint.
- Deploy the Gateway API.

## Monitoring
URL: https://radix.wiki/developers/legacy-docs/run/network-gateway/maintenance-1/monitoring Updated: 2026-02-18 Summary: The rest of this article explains more detail about the gateway logging configuration, and the various metrics exposed by the gateway.
Run > Network Gateway > Maintenance > Monitoring — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/maintenance-1/monitoring.md)

Network Gateway Monitoring

We recommend installing a Grafana Dashboard (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-maintenance-and-administration/node-setting-up-grafana.md) to help monitor your node and gateway. The rest of this article explains more detail about the gateway logging configuration, and the various metrics exposed by the gateway.

Logs

Logging configuration follows the ASP.NET paradigms (https://docs.microsoft.com/en-us/aspnet/core/fundamentals/logging/?view=aspnetcore-6.0#dnrvs). In particular, both the log levels and the logger can be configured in the configuration. By default, a simple one-line console logger is used in development, and a JSON logger is used in production. These can be configured further in the app configuration, as per the ASP.NET guidance (https://docs.microsoft.com/en-us/dotnet/core/extensions/console-log-formatter) - e.g. this is done in the deployment folder (https://github.com/radixdlt/babylon-gateway/tree/v1.0.0/deployment) to optimize log readability in Docker. An example of configuring the systemd console logger is given below.

{ "Logging": { "Console": { "FormatterName": "systemd", "FormatterOptions": { "IncludeScopes": true, "UseUtcTimestamp": true, "TimestampFormat": "yyyy-MM-ddTHH\\:mm\\:ss.fff\\Z " } } } }

Health Checks

The Data Aggregator and Gateway API have a health check configured on their main ASP.NET URL: /health will return a 200 status if all health checks pass, or a 500 if one or more health checks fail. This can be integrated with Kubernetes or other health checking systems.
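The /health contract above (200 when healthy, non-200 otherwise) makes automated readiness gating straightforward in deployment scripts. Below is a minimal Python sketch; the helper and its defaults are our own, and the probe (e.g. an HTTP GET against /health that returns the status code) is supplied by the caller:

```python
import time


def wait_until_healthy(probe, timeout=60.0, interval=2.0,
                       sleep=time.sleep, clock=time.monotonic):
    """Poll `probe` (a callable returning an HTTP status code) until it
    returns 200, or until `timeout` seconds elapse. Returns True if healthy."""
    deadline = clock() + timeout
    while True:
        if probe() == 200:
            return True
        if clock() + interval > deadline:
            return False  # would overshoot the deadline; give up
        sleep(interval)
```

A caller might pass, for example, `lambda: urllib.request.urlopen("http://localhost:8080/health").status` as the probe (the port here is hypothetical; use whatever your Kestrel configuration exposes).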
The Data Aggregator has a health check for database connectivity, and a custom health check for either a recent start-up (within 10 seconds) or a ledger extension in the last 20 seconds (this can be configured with the Monitoring.UnhealthyCommitmentGapSeconds parameter). The Gateway API has a health check for each of its database connections. The health check endpoint will come up after the service loads. For the Data Aggregator, the migrations run before the health check is up, so we recommend running the migrations separately if they are slow-running; see releasing (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/maintenance-1/releasing.md) for more information.

Prometheus Metrics

The Network Gateway services export metrics in Prometheus format, via metric endpoints, to be picked up by Prometheus. The default endpoints are:
- Data Aggregator - http://localhost:1234
- Gateway API - http://localhost:1235

These can be changed with the configuration variable PrometheusMetricsPort.

Metric Types

Metrics fall into a number of groupings, separated out by distinct prefixes.
These are metrics provided by libraries:
- dotnet_* - Metrics about the runtime (eg threadpool, known allocated memory)
- process_* - Metrics about the process (eg process threads, process memory)
- http_request_* and http_requests_* - Metrics about controller actions
- httpclient_* - Metrics about requests that the service makes to upstream services (ie the full nodes)
- aspnetcore_* - Metrics related to ASP.NET Core (eg healthcheck status)

There are custom metrics, all prefixed by ng_ (for network gateway):
- ng_aggregator_* - metrics about aggregator status
- ng_node_fetch_* - metrics about fetching data from a node
- ng_ledger_sync_* - metrics about syncing the ledger from full nodes
- ng_ledger_commit_* - metrics about committing the agreed ledger to the database
- ng_node_ledger_* - metrics about the ledger / state of the full node/s (with node label)
- ng_node_mempool_* - metrics about full node mempool/s (or the combination of them)
- ng_db_mempool_* - metrics about the MempoolTransactions in the database
- ng_construction_transaction_* - metrics relating to construction, submission or resubmission

Each service also exposes a /metrics endpoint, at a separate port to the health check / main APIs. This port can be changed with the PrometheusMetricsPort configuration, defaulting to 1234 for the Data Aggregator and 1235 for the Gateway API. Many custom metrics are available which can be used for a comprehensive dashboard. Each metric includes a description to explain how it can be interpreted. The various metrics are documented inline, which can be seen by going to the /metrics endpoint in your browser, or in Prometheus.

Alerting

Alerting should align with your monitoring requirements. The thresholds below may need adjusting for your use case.
Some suggested alerts are below:

- High - MoreThanOnePrimary - More than one primary Data Aggregator (this can cause high levels of errors, with both Data Aggregators trying to do the same thing). Criteria: sum(ng_aggregator_is_primary) != 1
- High - HighTimeSinceLastLedgerCommit - The DB ledger hasn’t been updated in the last minute. Criteria: time() - ng_ledger_commit_last_commit_timestamp_seconds{container="data-aggregator"} > 60
- Medium - Resubmission Queue Backlog - This might indicate that resubmissions are delayed. Criteria: ng_db_mempool_transactions_needing_resubmission_total{container="data-aggregator"} > 100
- Medium - FailingDataAggregatorHealthChecks. Criteria: sum(aspnetcore_healthcheck_status{container="data-aggregator"}) >= 1
- Medium - FailingGatewayAPIHealthChecks. Criteria: sum(aspnetcore_healthcheck_status{container="gateway-api"}) >= 1

Depending on your use cases, you may also wish to configure alerting on transaction submission or resubmission errors, possibly making use of some of the following metrics:
- ng_construction_transaction_submission_request_count, ng_construction_transaction_submission_success_count and ng_construction_transaction_submission_error_count
- ng_construction_transaction_resubmission_attempt_count, ng_construction_transaction_resubmission_success_count and ng_construction_transaction_resubmission_error_count

## Maintenance
URL: https://radix.wiki/developers/legacy-docs/run/network-gateway/maintenance-1/maintenance-1 Updated: 2026-02-18 Summary: Legacy documentation: Maintenance Run > Network Gateway > Maintenance — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/maintenance-1/README.md)

## Configuration
URL: https://radix.wiki/developers/legacy-docs/run/network-gateway/network-gateway-setup/custom-setup/configuration Updated: 2026-02-18 Summary: The Data Aggregator and Gateway APIs are ASP.NET Core 7 services, and can be configured using the ASP.NET Core configuration paradigm. Run > Network Gateway >
Setup > Custom Setup > Configuration — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/network-gateway-setup/custom-setup/configuration.md)

Network Gateway Configuration

App Configuration

The Data Aggregator and Gateway APIs are ASP.NET Core 7 services, and can be configured using the ASP.NET Core configuration paradigm (https://learn.microsoft.com/en-us/aspnet/core/fundamentals/configuration/?view=aspnetcore-7.0), with some slight tweaks.
- JSON app settings should be provided in an appsettings.Production.overrides.json file in the app’s run directory.
- Environment variables providing configuration values for the Data Aggregator may be prefixed with DataAggregator__ to disambiguate them from other environment variables.
- Environment variables providing configuration values for the Gateway API may be prefixed with GatewayApi__ to disambiguate them from other environment variables.
- If you wish to provide JSON app settings from a location other than the app’s run directory, you can use the environment variable CustomJsonConfigurationFilePath to point at a JSON file somewhere else in the system, from which settings can be read (assuming the user running the service has access to this file).

The recommended approach is to use JSON to configure the application, but provide any secrets as environment variable overrides. The configuration works on a priority basis, with configuration at a higher priority overriding any configuration at a lower priority.
From lowest to highest priority, these are:
- Variables provided in the application code as defaults
- Configuration in the appsettings.json file (provided by the application)
- Configuration in the appsettings.Production.json file (provided by the application)
- Configuration from environment variables
- Configuration from environment variables with the relevant app prefix DataAggregator__ or GatewayApi__
- Configuration provided by command line parameters
- Configuration in the appsettings.Production.overrides.json file (optionally provided by the user running the application)
- Configuration in the JSON file at CustomJsonConfigurationFilePath (optionally provided by the user running the application)

At v1, configuration provided in appsettings.Production.overrides.json or CustomJsonConfigurationFilePath overrides the environment variables (https://docs.microsoft.com/en-us/aspnet/core/fundamentals/configuration/?view=aspnetcore-6.0#json-configuration-provider). This may be adjusted in future versions. For now, we’d advise not relying on this priority ordering, and not specifying any values in the JSON documents which should be provided by environment variables. Default configuration for reference can be seen on GitHub in the various JSON files for the Data Aggregator (https://github.com/radixdlt/babylon-gateway/tree/v1.0.0/apps/DataAggregator) and Gateway API (https://github.com/radixdlt/babylon-gateway/tree/v1.0.0/apps/GatewayApi).

Common Configuration

Core API / Full Node Connections

Both the Data Aggregator and Gateway API will need to be configured to read from one or more Core APIs. These can live in the same private network, or can be configured to communicate over the internet, although note that the Core API is not designed to be publicly exposed. See the Radix Node Setup (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/README.md) guide for further information on running the Radix Nodes.
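Returning to the configuration priority ordering above: its effect is a layered lookup, where the highest-priority layer that defines a key wins. A plain-Python sketch, purely illustrative (the real services use the ASP.NET Core configuration providers):

```python
def resolve_setting(key, layers):
    """Return the value for `key` from the highest-priority layer defining it.

    `layers` is ordered lowest to highest priority, mirroring the list above
    (defaults, appsettings.json, environment variables, ..., overrides file).
    """
    for layer in reversed(layers):  # check highest-priority layers first
        if key in layer:
            return layer[key]
    raise KeyError(key)
```

For example, with `layers = [{"PrometheusMetricsPort": 1234}, {"PrometheusMetricsPort": 9000}]` (defaults, then an override), the override value 9000 is returned.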
In terms of Network Gateway configuration, the connection details for the Core API/s can be configured in the CoreApiNodes array. Each element of the array should configure a single Core API full node, and look like:

{ "Name": "", "CoreApiAddress": "", "Enabled": true }

The CoreApiAuthorizationHeader property can be added to this configuration object for each node, to use with Basic Authentication (say, provided by the default nginx configuration of a production node). This is likely a secret, so can be provided via an environment variable, eg DataAggregator__Network__CoreApiNodes__0__CoreApiAuthorizationHeader - where the 0 is replaced by the index of the CoreApiNode you wish to configure. The Authorization header value for basic auth is Basic <encoded>, where <encoded> = base64("USERNAME:PASSWORD"). If absolutely necessary, "DisableCoreApiHttpsCertificateChecks": "true" can be provided in the root configuration (eg DataAggregator__Network__DisableCoreApiHttpsCertificateChecks) to disable certificate checks if self-signed certificates are used. For advanced configuration, the Core API Node object also supports further options in the Data Aggregator (https://github.com/radixdlt/babylon-gateway/tree/v1.0.0/apps/DataAggregator), and some further options in the Gateway API (https://github.com/radixdlt/babylon-gateway/tree/v1.0.0/apps/GatewayApi).

Database Connections

The Data Aggregator and Gateway API are configured to talk to the PostgreSQL database via Npgsql connection strings (https://www.npgsql.org/doc/connection-string-parameters.html). In each of DataAggregator and GatewayApi, there are two settings to provide:
- ConnectionStrings.NetworkGatewayReadWrite in the JSON, or ConnectionStrings__NetworkGatewayReadWrite as an environment variable - used by the Data Aggregator for writing ledger state during sync, and for reading and writing data about transactions submitted through the Gateway API.
- ConnectionStrings.NetworkGatewayReadOnly in the JSON, or ConnectionStrings__NetworkGatewayReadOnly as an environment variable - used for read-only queries about ledger state, mostly in the Gateway API.

For single database deployments, the NetworkGatewayReadOnly connection string can match the NetworkGatewayReadWrite one.
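The CoreApiAuthorizationHeader value described above, Basic followed by base64 of "USERNAME:PASSWORD", can be computed in a couple of lines. A Python sketch (the helper name is our own):

```python
import base64


def basic_auth_header(username: str, password: str) -> str:
    """Build a Basic Authentication header value: "Basic " + base64("USERNAME:PASSWORD")."""
    token = base64.b64encode(f"{username}:{password}".encode("utf-8")).decode("ascii")
    return f"Basic {token}"
```

The resulting string can then be supplied via the environment variable form shown above, keeping the credentials out of the JSON configuration files.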
The connection string/s will contain secrets, so it will likely be easier to provide them as environment variables.

Data Aggregator Configuration

There are many configuration options, which can be seen in code (https://github.com/radixdlt/babylon-gateway/tree/v1.0.0/apps/DataAggregator), but notably the following are commonly adjusted by runners of the Data Aggregator:
- CommitRequiresNodeQuorumTrustProportion - this configures the proportion of sufficiently synced full nodes which must be seen to agree before committing to the database. Setting it to 1 requires all sufficiently synced-up nodes to return before committing.
- MempoolConfiguration.TrackTransactionsNotSubmittedByThisGateway can be set to false to disable tracking of pending transactions not submitted through the Gateway, which will reduce the resource and data requirements of the Data Aggregator during network congestion.

The port used for health checks can be configured using standard Kestrel configuration (https://docs.microsoft.com/en-us/aspnet/core/fundamentals/servers/kestrel/endpoints?view=aspnetcore-6.0).

Gateway API Configuration

There are many configuration options, which can be seen in code (https://github.com/radixdlt/babylon-gateway/tree/v1.0.0/apps/GatewayApi), but notably the following are commonly adjusted by runners of the Gateway API:
- AcceptableLedgerLag.ReadRequestAcceptableDbLedgerLagSeconds and AcceptableLedgerLag.ConstructionRequestsAcceptableDbLedgerLagSeconds configure the threshold before stale data results in the API returning a NotSyncedUpError - if the Data Aggregator isn’t sufficiently synced up, and the database is over this threshold behind consensus.
- EnableSwagger - If the service is only exposed privately, you can turn on this option to enable an easy web interface for interacting with the API at /swagger The port/address which is used for the Gateway API can be configured using standard Kestrel configuration (https://docs.microsoft.com/en-us/aspnet/core/fundamentals/servers/kestrel/endpoints?view=aspnetcore-6.0). ## Requirements URL: https://radix.wiki/developers/legacy-docs/run/network-gateway/network-gateway-setup/custom-setup/requirements Updated: 2026-02-18 Summary: These system and resiliency requirements are suggestions, for when deploying these services to individual kubernetes pods or equivalent, and may be tuned to the needs of an individual system runner. Run > Network Gateway > Setup > Custom Setup > Requirements — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/network-gateway-setup/custom-setup/requirements.md) Network Gateway - Service Requirements These system and resiliency requirements are suggestions, for when deploying these services to individual kubernetes pods or equivalent, and may be tuned to the needs of an individual system runner. Core API

| Model | vCPU | Memory (GB) | Storage (GB) | Network Bandwidth (Gbps) | Operating System |
| --- | --- | --- | --- | --- | --- |
| c5.2xlarge | 8 | 16 | Provision a gp2 storage volume (https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ebs-volume-types.html). You should initially provision 250 GB of SSD space | Up to 10 | Ubuntu 22.04.2.0 LTS (Jammy Jellyfish) (https://releases.ubuntu.com/22.04/) |

Data Aggregator Data Aggregator is stateless, with data stored in the PostgreSQL database. Only a single Data Aggregator should be run (for v1) at any given time. Having two running will cause many failed database writes, as they will step on each other. The Data Aggregator should be configured to restart on failure.
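If the Data Aggregator is run as a binary via systemd (one of the deployment options discussed under Custom Setup), the restart-on-failure requirement can be captured in the unit file. This is only a sketch; the unit name, install path and user are illustrative placeholders, not taken from the docs:

```ini
# /etc/systemd/system/data-aggregator.service (illustrative path)
[Unit]
Description=Radix Network Gateway - Data Aggregator
After=network-online.target
Wants=network-online.target

[Service]
# Placeholder user and binary location
User=radixdlt
ExecStart=/opt/babylon-gateway/DataAggregator
# Restart on failure, as the docs recommend
Restart=on-failure
RestartSec=5

[Install]
WantedBy=multi-user.target
```

If running the Docker container instead, a restart policy such as `restart: on-failure` in docker compose serves the same purpose.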
There is a health check endpoint available, discussed in the monitoring guide (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/maintenance-1/monitoring.md). The Data Aggregator depends on migrations executed by Database Migrations; those must be executed before the Data Aggregator is deployed. Suggested system requirements: - CPUs: 2 - Memory: at least 4 GiB PostgreSQL Database Suggested system requirements: - 1 write replica - 2 read replicas - CPUs: 2 - Memory: at least 32 GiB - Disk size: at least 512 GiB The database should be deployed resiliently. A managed service such as AWS Aurora is ideal for this. Read replicas can handle the main query load from the Gateway API, with the read/write primary configured for the Data Aggregator, and for pending transaction status reads and writes from the Gateway API. Gateway API Gateway APIs are stateless, and read data from the PostgreSQL database. One or more Gateway APIs can be configured to run against the same database, and may be placed behind a load balancer. The size and number of Gateway APIs you require will depend on the load profile you expect. The following is a suggested requirement for a Gateway API server which is set to auto-scale: - CPUs: 1 - Memory: at least 1 GiB ## Custom Setup URL: https://radix.wiki/developers/legacy-docs/run/network-gateway/network-gateway-setup/custom-setup/custom-setup Updated: 2026-02-18 Summary: Before starting with custom setup, read the Network Gateway architecture and setup overview(../README.md). Run > Network Gateway > Setup > Custom Setup — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/network-gateway-setup/custom-setup/README.md) Network Gateway - Custom Setup Overview Before starting with custom setup, read the Network Gateway architecture and setup overview (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/network-gateway-setup/README.md).
Overview For high-traffic production systems, these four components (full node/Core API, Data Aggregator, PostgreSQL DB and Gateway API) will need to be set up in a resilient configuration, and monitored. Depending on your infrastructure provider, there are many ways to deploy these components - so in these docs, we'll stick to the recommended requirements for a production setup (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/network-gateway-setup/custom-setup/requirements.md), and explain how the services can be configured (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/network-gateway-setup/custom-setup/configuration.md) and monitored (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/maintenance-1/monitoring.md), so that this can be adapted to your infrastructure requirements. Radix Publishing Ltd provides the Data Aggregator and Gateway APIs as both: - Docker containers (see Docker Hub: Data Aggregator (https://hub.docker.com/r/radixdlt/babylon-ng-data-aggregator), Gateway API (https://hub.docker.com/r/radixdlt/babylon-ng-gateway-api), Database Migration (https://hub.docker.com/r/radixdlt/babylon-ng-database-migrations)) - Binaries (see Network Gateway Github Releases (https://github.com/radixdlt/babylon-gateway/releases)) You can choose to configure / run these as best fits your needs. The Docker containers are ready to be configured and run out of the box, and are recommended as the easiest deployment option. If you choose to run the binaries themselves (say, via systemd), you will need to set up the server to run an ASP.NET Core app on the .NET 7 runtime, configured with appropriate resiliency, eg restarting on failure. If deploying the Grafana / Monitoring stack (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-maintenance-and-administration/node-setting-up-grafana.md), we would always recommend deploying it onto a separate host.
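As an illustration of how the containerized services fit together, here is a minimal docker-compose sketch. The image names come from the Docker Hub links above, and the ConnectionStrings__*/DataAggregator__* variable names from the configuration section; everything else (hostnames, database name, the migrations connection-string name, restart policies) is an assumption, not a prescribed setup:

```yaml
# Illustrative only: wiring the three Network Gateway services together.
services:
  database-migrations:
    image: radixdlt/babylon-ng-database-migrations
    environment:
      # Variable name assumed by analogy with the documented ConnectionStrings__* pattern.
      ConnectionStrings__NetworkGatewayMigrations: "Host=gateway-db.internal;Database=radix_ledger;Username=gateway;Password=${DB_PASSWORD}"
  data-aggregator:
    image: radixdlt/babylon-ng-data-aggregator
    restart: on-failure          # the Data Aggregator should restart on failure
    depends_on:
      database-migrations:
        condition: service_completed_successfully   # migrations must run first
    environment:
      ConnectionStrings__NetworkGatewayReadWrite: "Host=gateway-db.internal;Database=radix_ledger;Username=gateway;Password=${DB_PASSWORD}"
      DataAggregator__Network__CoreApiNodes__0__CoreApiAuthorizationHeader: "Basic ${CORE_API_BASIC_AUTH}"
  gateway-api:
    image: radixdlt/babylon-ng-gateway-api
    restart: on-failure
    environment:
      ConnectionStrings__NetworkGatewayReadOnly: "Host=gateway-db.internal;Database=radix_ledger;Username=gateway;Password=${DB_PASSWORD}"
      ConnectionStrings__NetworkGatewayReadWrite: "Host=gateway-db.internal;Database=radix_ledger;Username=gateway;Password=${DB_PASSWORD}"
```

The PostgreSQL database is deliberately left out of the compose file, in line with the recommendation elsewhere in these docs to run it via systemd or as a managed service rather than in docker.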
## Setup with CLI URL: https://radix.wiki/developers/legacy-docs/run/network-gateway/network-gateway-setup/setup-with-cli Updated: 2026-02-18 Summary: Before starting on this, please read the Network Gateway Setup Overview(README.md) to decide what setup you require. Run > Network Gateway > Setup > Setup with CLI — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/network-gateway-setup/setup-with-cli.md) Install gateway and core using CLI Before starting on this, please read the Network Gateway Setup Overview (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/network-gateway-setup/README.md) to decide what setup you require. Introduction The babylonnode CLI tool supports installing a Network Gateway, as well as a full node (AKA core). All services will be set up to run in docker, except PostgreSQL, which will be set up to run using systemd. This is because, in our experience, running a PostgreSQL docker image has a tendency to hang the docker daemon or make it unresponsive. This guide will take you through how to set up the full stack with docker and the babylonnode CLI. 1. Provision the host machines The first step is to work out which setup and host machine requirements best suit your needs using the Network Gateway Setup Overview (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/network-gateway-setup/README.md). Once you have provisioned the required host machines, continue with the instructions below. 2. Prepare the host machines This only needs to be run during first-time setup on a new host machine. On each host machine, first install the CLI (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/network-gateway-setup/setup-with-cli.md). Then install the dependencies: babylonnode docker dependencies Once the process has completed, you will be asked to log out of your ssh bash session and log back in. 3.
Configure each host machine You should only need to do this the first time you set up a host machine. If you're upgrading, skip to the next step. First, for each host, you are led through an interactive prompt to create a config.yaml file, which by default is stored at /home/ubuntu/babylon-node-config/config.yaml. This config file is then used for future installs and upgrades, to avoid having to remember your settings later. KeyStore and PostgreSQL passwords which are created/entered during the config command are written to the generated config.yaml file. Single Host Running both full node and Gateway database on the same host is not recommended as it creates IO contention, resulting in a slower ledger sync. Instead, it is recommended to run the database on a separate host from the full node, using one of the other options. This process will configure the following services onto the single host: - [CORE, STATEFUL] A full node running with docker compose, with its associated ledger store and key store. - [GATEWAY, STATELESS] A Data Aggregator running with docker compose. - [GATEWAY, STATEFUL] A PostgreSQL database, run with systemd. - [GATEWAY, STATELESS] A Gateway API running with docker compose. - [CORE & GATEWAY, STATELESS] An Nginx reverse proxy running with docker compose, providing access to the Core, System and Gateway APIs. This setup can be achieved by creating a config file using the combined simple CORE GATEWAY mode: babylonnode docker config -m CORE GATEWAY Single Host with separate DB This process will configure the following services onto the single host: - [CORE, STATEFUL] A full node running with docker compose, with its associated ledger store and key store. - [GATEWAY, STATELESS] A Data Aggregator running with docker compose. - [GATEWAY, STATELESS] A Gateway API running with docker compose. - [CORE & GATEWAY, STATELESS] An Nginx reverse proxy running with docker compose, providing access to the Core, System and Gateway APIs.
It is assumed that you already have a PostgreSQL database with sufficient specification (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/network-gateway-setup/custom-setup/configuration.md) - either in a separate managed service or on another host - that you can connect to from the host we're configuring. You will need to provide connection details to this database as part of the config process. This setup can be achieved by creating a config file using the DETAILED mode: babylonnode docker config -m DETAILED Dual Host This process will configure the following services onto the first host: - [CORE, STATEFUL] A full node running with docker compose, with its associated ledger store and key store. - [CORE, STATELESS] An Nginx reverse proxy running with docker compose, providing access to the Core and System APIs. This process will configure the following services onto the second host: - [GATEWAY, STATELESS] A Data Aggregator running with docker compose. - [GATEWAY, STATEFUL] A PostgreSQL database, run with systemd. - [GATEWAY, STATELESS] A Gateway API running with docker compose. - [GATEWAY, STATELESS] An Nginx reverse proxy running with docker compose, providing access to the Gateway API. This setup can be achieved by creating a config file using the DETAILED mode, on each host: babylonnode docker config -m DETAILED Dual Host with separate DB This process will configure the following services onto the first host: - [CORE, STATEFUL] A full node running with docker compose, with its associated ledger store and key store. - [CORE, STATELESS] An Nginx reverse proxy running with docker compose, providing access to the Core and System APIs. This process will configure the following services onto the second host: - [GATEWAY, STATELESS] A Data Aggregator running with docker compose. - [GATEWAY, STATELESS] A Gateway API running with docker compose.
- [GATEWAY, STATELESS] An Nginx reverse proxy running with docker compose, providing access to the Gateway API. It is assumed that you already have a PostgreSQL database with sufficient specification (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/network-gateway-setup/custom-setup/configuration.md) - either in a separate managed service or on another host - that you can connect to from the second host. You will need to provide connection details to this database as part of the config process. This setup can be achieved by creating a config file using the DETAILED mode, on each host: babylonnode docker config -m DETAILED 4. Install or update on each host machine Once the configuration file has been created, run the following on each host: Docker mode babylonnode docker install - Optional - Use parameter -f <config file path> if the config file is different from /home/ubuntu/babylon-node-config/config.yaml. - Optional - Use parameter -a or --autoapprove to run this command without any prompts. It is only recommended for automation purposes. - Use parameter -u or --update to deploy the latest versions of the software. The configuration file includes a version number. If the CLI is updated to use a new configuration version, it may prompt you to manually migrate the configuration file to the new version. Instructions to perform this manual migration will live in the release notes of the CLI. 5. Set passwords for nginx reverse proxy The full node's Core API and System API, and the Network Gateway's Gateway API, are all protected by nginx. If Core and Gateway are on the same host, they share a single nginx; otherwise, the full node and Gateway API will have separate nginx instances on each host. Nginx runs bound to http on port 80, and https on port 443 - note that the certificates it exposes for https are self-signed. When you run setup, you can select which endpoints are protected by HTTP basic auth.
A number of users are set up for you, with each user protecting different sets of endpoints: - gateway - For Gateway API endpoints - admin - For low-risk node endpoints, not designed to be public-facing (Core/System API) - superadmin - For high-risk node endpoints, that should under no circumstances be exposed publicly (Core/System API) - metrics - For the node, Data Aggregator and Gateway API metrics endpoints For more details on these endpoints, see the api specification docs (https://github.com/gguuttss/radix-docs/blob/master/integrate/network-apis/README.md). By default, a random password is generated for each user and output by the CLI, but this can be updated using the following CLI commands. Ensure you update the relevant password on each host that runs nginx:

babylonnode auth set-gateway-password --setupmode DOCKER
babylonnode auth set-admin-password --setupmode DOCKER
babylonnode auth set-superadmin-password --setupmode DOCKER
babylonnode auth set-metrics-password --setupmode DOCKER

6. Make sure the gateway is running The gateway status can be checked by running the below command:

curl --request GET --insecure --user "gateway:$NGINX_GATEWAY_PASSWORD" https://localhost/gateway

It should return a response like the one shown below:

{
  "network_identifier": {
    "network": "mainnet"
  },
  "gateway_api": {
    "version": "1.1.7",
    "open_api_schema_version": "1.1.6"
  },
  "ledger_state": {
    "version": 54523000,
    "timestamp": "2022-02-09T05:12:20.749Z",
    "epoch": 5389,
    "round": 9987
  },
  "target_ledger_state": {
    "version": 89039430
  }
}

The key values in the above response are: - gateway.network_identifier.network should point to the right network. For mainnet, this should be mainnet - ledger_state.version is the ledger transaction number that your GATEWAY has synced up to. If this value is close to target_ledger_state.version then the GATEWAY is synced up.
- target_ledger_state.version is the latest ledger transaction number that the connected CORE node can see on the network. 7. Setting up Monitoring Monitoring can be set up on the same host as an existing setup, or on another host. It also includes a database, so can cause increased IO contention. For that reason, we’d recommend running it on a separate host, with connections to the other hosts. Please see this guide for setting up the Grafana Monitoring stack (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-maintenance-and-administration/node-setting-up-grafana.md) . ## Setup URL: https://radix.wiki/developers/legacy-docs/run/network-gateway/network-gateway-setup/network-gateway-setup Updated: 2026-02-18 Summary: A Network Gateway deployment consists of the following services: Run > Network Gateway > Setup — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/network-gateway-setup/README.md) Network Gateway Setup Overview Architecture Overview A Network Gateway deployment consists of the following services: - One or more Radix Nodes, in their Full Node configuration, with the transaction endpoint of the Core API enabled. - A single[1] (Network Gateway) Data Aggregator. - A PostgreSQL Database - One or more (Network Gateway) Gateway APIs These services may be spread between one or more hosts, depending on the deployment approach. Choosing the right setup For any setup used in production, we would recommend full stack redundancy - running two stacks in parallel - a production stack, and a backup stack. The backup stack can be swapped to if the production stack hits any issues. At update time, the backup stack can be updated first as a trial, and then swapped out with the production stack, which can then be upgraded. 
Depending on your use case, we recommend different setups: Local development/testing If you just want to try running a stack, without needing it fully synced, you may run a full stack locally following the Local Development deployment approach tab below. This may cause your machine to run slowly, and will take a long time to fully sync. It runs Network Gateway services and PostgreSQL database as docker containers. If you require working against a fully synced Gateway, we recommend creating a cloud hosted deployment of the whole stack, and developing against that. See the Internal-facing, low-medium traffic tab for more details. Internal-facing, low-medium traffic For low-medium traffic levels (10-50 Gateway API requests per second, depending on database host and request type), deploying a single copy of each service should be sufficient (with a whole parallel stack for production redundancy). - If sluggish system performance during sync is acceptable, you can attempt running a full stack on a single, well-provisioned host. For host requirements, see the Single Host tab below. - The first optimization we’d recommend is having a separate host for the Gateway Database, ideally a managed PostgreSQL database service. This reduces IO contention between the full node and the Gateway. For host requirements, see the Single Host with separate DB tab below. - You can also achieve this by running the full node on one host, and the rest of the Gateway stack (including the PostgreSQL database) on a second host. For host requirements, see the Dual Host tab below. - For best performance, we’d recommend deploying the full node onto one host, the stateless Gateway services onto a second host, and use an additional dedicated database host, or managed database service. For host requirements, see Dual Host with separate DB tab below. For the deployment approach, you can consider the CLI (recommended) or Custom options, in tabs below. 
High traffic For high traffic (> 50 Gateway API requests per second), scalable or public-facing loads, we recommend a setup using Kubernetes or another highly available, scalable orchestration system. This would include: - Two full nodes - A single data aggregator - A dedicated host or replica set for the PostgreSQL database - An auto-scaling group of Gateway APIs behind a load balancer For host requirements of the pods, see the Kubernetes tab below. For the deployment approach, see the Custom option, in a tab below. Host Requirements Whatever your setup, we'd also recommend deploying the Grafana / Monitoring stack (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-maintenance-and-administration/node-setting-up-grafana.md), which should be deployed onto a separate host. Single Host - This setup runs both CORE and GATEWAY on the same machine. This setup is not recommended as it creates IO contention, resulting in a slower ledger sync. Instead, it is recommended to use another deployment option. - For optimal performance, we'd recommend running PostgreSQL using systemd or using a separate database on a dedicated host. If run in a docker image, PostgreSQL has a tendency to consume all available resources, and may make the docker daemon sluggish or unresponsive. This is particularly bad whilst syncing.
| Model | vCPU | Memory (GB) | Storage (GB) | Network Bandwidth (Gbps) | Operating System |
| --- | --- | --- | --- | --- | --- |
| c5.4xlarge | 16 | 32 | Provision a gp2 storage volume (https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ebs-volume-types.html). You should initially provision 500 GB of SSD space | Up to 10 | Ubuntu 22.04.2.0 LTS (Jammy Jellyfish) (https://releases.ubuntu.com/22.04/) |

Single Host with separate DB CORE and GATEWAY:

| Model | vCPU | Memory (GB) | Storage (GB) | Network Bandwidth (Gbps) | Operating System |
| --- | --- | --- | --- | --- | --- |
| c5.2xlarge | 8 | 16 | Provision a gp2 storage volume (https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ebs-volume-types.html). You should initially provision 250 GB of SSD space | Up to 10 | Ubuntu 22.04.2.0 LTS (Jammy Jellyfish) (https://releases.ubuntu.com/22.04/) |

GATEWAY DB: See the dedicated Network Gateway service requirements (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/network-gateway-setup/custom-setup/requirements.md) for the suggested Gateway DB requirements. Dual Host For optimal performance, we'd recommend running PostgreSQL using systemd or using a separate database on a dedicated host. If run in a docker image, PostgreSQL has a tendency to consume all available resources, and may make the docker daemon sluggish or unresponsive. This is particularly bad whilst syncing.
CORE:

| Model | vCPU | Memory (GB) | Storage (GB) | Network Bandwidth (Gbps) | Operating System |
| --- | --- | --- | --- | --- | --- |
| c5.2xlarge | 8 | 16 | Provision a gp2 storage volume (https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ebs-volume-types.html). You should initially provision 250 GB of SSD space | Up to 10 | Ubuntu 22.04.2.0 LTS (Jammy Jellyfish) (https://releases.ubuntu.com/22.04/) |

GATEWAY:

| Model | vCPU | Memory (GB) | Storage (GB) | Network Bandwidth (Gbps) | Operating System |
| --- | --- | --- | --- | --- | --- |
| c5.4xlarge | 16 | 32 | Provision a gp2 storage volume (https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ebs-volume-types.html). You should initially provision 600 GB of SSD space | Up to 10 | Ubuntu 22.04.2.0 LTS (Jammy Jellyfish) (https://releases.ubuntu.com/22.04/) |

Dual Host with separate DB CORE:

| Model | vCPU | Memory (GB) | Storage (GB) | Network Bandwidth (Gbps) | Operating System |
| --- | --- | --- | --- | --- | --- |
| c5.2xlarge | 8 | 16 | Provision a gp2 storage volume (https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ebs-volume-types.html). You should initially provision 250 GB of SSD space | Up to 10 | Ubuntu 22.04.2.0 LTS (Jammy Jellyfish) (https://releases.ubuntu.com/22.04/) |

GATEWAY: See the dedicated Network Gateway service requirements (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/network-gateway-setup/custom-setup/requirements.md). You will want to combine Data Aggregator and Gateway API requirements, as these will be deployed onto the same host. Both of these services are stateless. GATEWAY DB: See the dedicated Network Gateway service requirements (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/network-gateway-setup/custom-setup/requirements.md) for the suggested Gateway DB requirements. Kubernetes See the dedicated Network Gateway service requirements (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/network-gateway-setup/custom-setup/requirements.md) for requirements by service / pod.
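For Kubernetes deployments, the suggested per-service requirements (Data Aggregator: 2 CPUs / 4 GiB; Gateway API: 1 CPU / 1 GiB) can be expressed as pod resource requests. A sketch only; the Deployment shape, labels, and the absence of limits are assumptions, not from the docs:

```yaml
# Illustrative resource requests for the (stateless) Data Aggregator pod.
# replicas is pinned to 1 because only a single Data Aggregator may run (v1).
apiVersion: apps/v1
kind: Deployment
metadata:
  name: data-aggregator
spec:
  replicas: 1
  selector:
    matchLabels:
      app: data-aggregator
  template:
    metadata:
      labels:
        app: data-aggregator
    spec:
      containers:
        - name: data-aggregator
          image: radixdlt/babylon-ng-data-aggregator
          resources:
            requests:
              cpu: "2"
              memory: 4Gi
```

A Gateway API Deployment would look similar, but with smaller requests (1 CPU / 1 GiB), more than one replica, and a Service/load balancer in front; the Gateway API is the component intended to auto-scale.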
Deployment Approach Local Development If you are looking to run a local system for development of integrations with a Network Gateway, you can see the documentation on running a toy local environment (https://github.com/radixdlt/babylon-gateway/tree/v1.0.0/deployment) which runs the full stack in docker compose on a single machine. This setup isn’t designed for production workloads, and so defaults to running against stokenet. CLI This runs the node and gateway on Ubuntu hosts, with the option of connecting to an external PostgreSQL Database. See setting up the network gateway using the CLI (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/network-gateway-setup/setup-with-cli.md) for more information. The CLI configures the Gateway with sensible default configuration. If you’d like to have more ability to configure your Gateway, you may need to instead deploy and configure a custom network gateway. (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/network-gateway-setup/custom-setup/README.md) Custom See setting up a custom network gateway (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/network-gateway-setup/custom-setup/README.md) . 1. For v1, only a single Data Aggregator is supported. In future versions, we hope to add support for allowing deployment of multiple Data Aggregators, which will configure themselves as a primary and 0 or more secondaries, ready for hot failover if required ## Network Gateway URL: https://radix.wiki/developers/legacy-docs/run/network-gateway/network-gateway Updated: 2026-02-18 Summary: The Network Gateway (GitHub) is designed to be the Radix-run publicly exposed gateway into the Babylon Radix network. Run > Network Gateway — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/README.md) An Introduction to the Network Gateway What is the Network Gateway? 
The Network Gateway (GitHub (https://github.com/radixdlt/babylon-gateway)) is designed to be the Radix-run publicly exposed gateway into the Babylon Radix network. The system is in three main parts: The Database Migration sets up the PostgreSQL database and applies schema migrations if necessary. The Data Aggregator connects to the Core API of one or more full nodes, ingests from their transaction stream endpoint, and commits transactions to a PostgreSQL database. It also handles the resubmission of submitted transactions, where relevant. The Gateway API provides the public API for Wallets and Explorers. It handles read queries using the database, and proxies transaction preview and submission requests to the Core API of one or more full nodes. Running a Network Gateway or similar service If you wish to perform custom queries on ledger data, or integrate an application against the Radix Network, it is best to run your own system to compile data from Radix Nodes. The Network Gateway code forms a reference implementation, and can be deployed directly, forked, or used as a reference to build other systems which integrate with the Core API. If you wish to deploy a Network Gateway directly, we provide some guidance in the Setup (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/network-gateway-setup/README.md) article.
## Register as a Validator URL: https://radix.wiki/developers/legacy-docs/run/node/workbench/trash-to-remove/docker-register-as-a-validator Updated: 2026-02-18 Summary: A Validator node is a Full Node that has registered with the Radix network to receive delegated stake and potentially be selected to participate in network consensus. Run > Node > Workbench > Trash - to remove > Register as a Validator — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/node/workbench/trash-to-remove/docker-register-as-a-validator.md) https://docs-babylon.radixdlt.com/main/node-and-gateway/docker-register-as-validator.html (https://docs-babylon.radixdlt.com/main/node-and-gateway/docker-register-as-validator.html) Registering a Validator Node Introduction A Validator node is a Full Node that has registered with the Radix network to receive delegated stake and potentially be selected to participate in network consensus. Validator nodes provide the critical infrastructure of the Radix network and may receive special incentive rewards as a result. However, attempting to become one of the network's 100 validator nodes is not a decision to be taken lightly, requiring commitment to high-reliability operation and engagement with the Radix community. Registration as a validator node alone does not guarantee participation in consensus or that you will receive incentive rewards. Click here for general information about staking and validator participation. (https://learn.radixdlt.com/categories/staking-on-radix?_gl=1*zd2b23*_ga*ODM5MDk4MjgxLjE2OTU4Nzg2Njg.*_ga_MZBXX3HP5Q*MTY5ODI1MzAxOC4xMy4xLjE2OTgyNTg5ODAuNjAuMC4w) Once running, a validator node offers two interface endpoints on a private port: The /core endpoint can be used to conduct transactions from the node's account, including validator configuration, registration, and de-registration. The node's account must hold XRD tokens to pay for network fees on these transactions.
The /system endpoint can be used to query aspects of the node and network such as current version, current peers, etc. Prerequisites To register a validator node, you need a Radix full node installed, running, and syncing with the network. If you haven't yet set one up, then please run through one of these guides: Installing and Running a Node as a Docker Instance (docker-compose-mode) Installing and Running a Node using SystemD (systemd-mode) Register the node as a validator Please refer to Registering a Validator Node (registering-a-validator-node) ## Trash - to remove URL: https://radix.wiki/developers/legacy-docs/run/node/workbench/trash-to-remove/trash-to-remove Updated: 2026-02-18 Summary: Legacy documentation: Trash - to remove Run > Node > Workbench > Trash - to remove — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/node/workbench/trash-to-remove/README.md) ## Workbench URL: https://radix.wiki/developers/legacy-docs/run/node/workbench/workbench Updated: 2026-02-18 Summary: Legacy documentation: Workbench Run > Node > Workbench — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/node/workbench/README.md) ## Setting up a Grafana dashboard URL: https://radix.wiki/developers/legacy-docs/run/node/node-maintenance-and-administration/node-setting-up-grafana Updated: 2026-02-18 Summary: Radix uses Prometheus and Grafana to provide nodes and gateways with a real-time monitoring and alert system. Prometheus reads from metrics API endpoints Run > Node > Maintenance and Administration > Setting up a Grafana dashboard — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-maintenance-and-administration/node-setting-up-grafana.md) Introduction Radix uses Prometheus (https://prometheus.io/) and Grafana (https://grafana.com/) to provide nodes and gateways with a real-time monitoring and alert system. Prometheus reads from metrics API endpoints of the node and gateway, and stores the results as time series data.
Grafana reads this time series data and uses it to build monitoring dashboards for displaying node information and alerts in real-time. The Prometheus/Grafana installation has been packaged as part of the babylonnode script. This will allow node runners to get a monitoring dashboard up and running by simply running the script and answering a few questions. You can still set up a Prometheus/Grafana dashboard if you haven't used the babylonnode script to set up your node, but this guide doesn't cover it. Prerequisites It probably goes without saying that before you can install the node monitoring software, you must have a node and/or gateway up and running. If you haven't, then run through one of our guides, which will show you how to do it. Start by reading the Node Setup Introduction (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/README.md) to work out which service/s to run. Monitoring can be set up on the same host as an existing setup, or on another host. Monitoring also includes a database, so can cause increased IO contention. For that reason, we'd recommend running it on a separate host, with connections to the other host/s. Before you start, you must have the babylonnode CLI (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/node-setup-guided/node-setup-guided-installing-cli.md) installed on the host on which you intend to install monitoring. If running monitoring on a separate host, we'd recommend the following (or equivalent) specs:

| Model | vCPU | Memory | Storage | Network Bandwidth | Operating System |
| --- | --- | --- | --- | --- | --- |
| t3.medium | 2 | 4 GB | Provision a gp2 storage volume (https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ebs-volume-types.html). You should initially provision 60 GB of SSD space | At least 100 Mbps | Ubuntu 22.04.2.0 LTS (Jammy Jellyfish) (https://releases.ubuntu.com/22.04/) |

1.
Set up the Nginx Metrics password In order to make sure the node/gateway is as secure as possible, the metrics API endpoint is configured with a separate password in the nginx layer. On each host with a node or gateway, we need to set up the username/password for this endpoint. Run one of the following babylonnode commands (depending on your installation type) to set up the endpoint’s authorization. Docker If PostgreSQL is running locally, this command may not respond. In that case, it is advised to stop the deployed software using the command babylonnode docker stop and then run this command. babylonnode auth set-metrics-password -m DOCKER systemd - If you are running a systemd node then you must first change to the radixdlt user: sudo su - radixdlt Then run the command below to set up the password: babylonnode auth set-metrics-password -m SYSTEMD 2. Set up the monitoring service This step will be performed on the host on which you wish to install monitoring. The monitoring exposes a user dashboard on port 3000 on your server, so make sure that this port is open and available for Grafana to use. The monitoring setup is split into two commands: - the config command is used to create a config file for monitoring. This lets you persist configuration that you may customise, so it does not get overwritten during an update - the install command is used to install the monitoring using the config file 2.1. Create config file for monitoring setup A separate config.yaml file is used for monitoring from the one created for CORE and GATEWAY. By default, it lives at ~/monitoring/config.yaml There are easy commands to configure monitoring with a Core (on the same host as monitoring, or another host), or with a Core and Gateway (on the same host as monitoring, or on a single other host). For other use cases, check out the Advanced setup. Core (same host) babylonnode monitoring config \ -m MONITOR_CORE \ -cm [metrics_password] - Setup mode selected here as MONITOR_CORE.
This only sets up config for monitoring the core node. - The password for the metrics user that was configured in the previous step. The username for the metrics user is typically metrics, but can be set to something else if the host was configured using the DETAILED config mode. Core (other host) NODE_HOST_IP_OR_NAME=: \ babylonnode monitoring config \ -m MONITOR_CORE \ -cm [metrics_password] - The host of the Core, and the optional :port to its metrics endpoint (if nonstandard) - Setup mode selected here as MONITOR_CORE. This only sets up config for monitoring the core node. - The password for the metrics user that was configured in the previous step. The username for the metrics user is typically metrics, but can be set to something else if the host was configured using the DETAILED config mode. Core & Gateway (same host) babylonnode monitoring config \ -m MONITOR_CORE MONITOR_GATEWAY \ -cm [metrics_password] \ -gm [metrics_password] \ -am [metrics_password] - The setup mode is selected here as MONITOR_CORE and MONITOR_GATEWAY. This creates config for monitoring both the core full node and gateway. - The password for the metrics user that was configured in the previous step. This is for the core full node, where nginx protects the path /prometheus/metrics. - The password for the metrics user that was configured in the previous step. This is for the network gateway’s gateway api, where nginx protects the path /gateway/metrics. - The password for the metrics user that was configured in the previous step. This is for the network gateway’s data aggregator, where nginx protects the path /aggregator/metrics. The parameters cm, gm and am are separate, although in many cases they will share the same password. The options are separate to allow running the services on different hosts, where separate passwords may be used. The username for the metrics user is typically metrics, but can be set to something else if the host was configured using the DETAILED config mode.
In that case, see the Advanced setup. Core & Gateway (single other host) NODE_HOST_IP_OR_NAME=: \ babylonnode monitoring config \ -m MONITOR_CORE MONITOR_GATEWAY \ -cm [metrics_password] \ -gm [metrics_password] \ -am [metrics_password] - The host of the Core and Gateway, and the optional :port to its metrics endpoints (if nonstandard) - The setup mode is selected here as MONITOR_CORE and MONITOR_GATEWAY. This creates config for monitoring both the core full node and gateway. - The password for the metrics user that was configured in the previous step. This is for the core full node, where nginx protects the path /prometheus/metrics. - The password for the metrics user that was configured in the previous step. This is for the network gateway’s gateway api, where nginx protects the path /gateway/metrics. - The password for the metrics user that was configured in the previous step. This is for the network gateway’s data aggregator, where nginx protects the path /aggregator/metrics. The parameters cm, gm and am are separate, although in many cases they will share the same password. The options are separate to allow running the services on different hosts, where separate passwords may be used. The username for the metrics user is typically metrics, but can be set to something else if the host was configured using the DETAILED config mode. In that case, see the Advanced setup. Advanced babylonnode monitoring config \ -m DETAILED - Setup mode selected here as DETAILED. This will walk you through a series of questions to set up the config file appropriately. Alternatively, you can manually edit the config file created by the other commands to point to the right values for the monitored host. 2.2. Run setup to install and run monitoring When the monitoring has been configured, you can now run the script to set up monitoring: babylonnode monitoring install And that’s pretty much it. 3. Accessing the dashboard.
From the monitoring host, you can view your node’s dashboard remotely using any browser, using this URL pattern: http://:3000/d/radix_node_dashboard/radix-node-dashboard?orgId=1&refresh=5s The monitoring-host-ip is the external IP address of your host where monitoring is running. Grafana will display a page asking for your username and password. the Grafana welcome screen Since this is the first time you’ve run the monitor, enter `admin` for the username and `admin` again for the password. Grafana will now display another dialog asking you to change the password for the `admin` user. pick a new password You will now see a blank page with a somewhat discouraging `Not Found` message at the top. Click the Search icon (the magnifying glass). opening window Navigate to Dashboards > Radix Node Dashboard: selecting the dashboard Grafana will now display the example dashboard: The Grafana Dashboard 4. Accessing the metrics endpoint Now that the dashboard is up and running, you may want to add new elements yourself, or use the data from the node to build other applications for monitoring and gathering information. The same data that Grafana uses to build its screen is available on the metrics endpoint/s:

# Core node:
curl -k -X GET --location "https://CORE_NODE_HOST:METRICS_PORT/prometheus/metrics" -H "Content-Type: application/json" --basic --user metrics:nginx-password

# Network Gateway - Data Aggregator
curl -k -X GET --location "https://DATA_AGGREGATOR_HOST:METRICS_PORT/aggregator/metrics" -H "Content-Type: application/json" --basic --user metrics:nginx-password

# Network Gateway - Gateway API
curl -k -X GET --location "https://GATEWAY_API_HOST:METRICS_PORT/gateway/metrics" -H "Content-Type: application/json" --basic --user metrics:nginx-password

Where nginx-password is the password you set up earlier. 5.
Shut down Node monitoring To shut down the monitor on the monitoring host, use the following command: babylonnode monitoring stop More information on using babylonnode You can use babylonnode to carry out a variety of administrative tasks. For a full reference on the commands the babylonnode CLI offers, please see the documentation on the CLI GitHub repository (https://github.com/radixdlt/babylon-nodecli/blob/main/docs/command_reference.adoc) . If you have any questions or run into problems, you can get support from the Radix team members and our fantastic community on our Discord server. ## Optimizing node's performance URL: https://radix.wiki/developers/legacy-docs/run/node/node-maintenance-and-administration/node-optimizing-performance Updated: 2026-02-18 Summary: Once your node is running and connected to the network, you can optimise it for better performance and more efficient use of system resources. Of course, these Run > Node > Maintenance and Administration > Optimizing node's performance — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-maintenance-and-administration/node-optimizing-performance.md) Introduction Once your node is running and connected to the network, you can optimise it for better performance and more efficient use of system resources. Of course, these changes will depend on your hardware setup, the system resources available, and what kind of node you’re running: a full node or a validator node. What we’re presenting here is a guide based on the experience of the Radix engineers and DevOps. Your mileage, of course, may vary. We’re going to change the system resource allocations by changing the parameters in the ulimit configuration, but before we get going, let’s take a look at the configuration settings we’re interested in:

| Parameter | Description | Setting |
| --- | --- | --- |
| nofile | The number of files the host OS can keep open at once. | 65536 |
| nproc | The number of processes the OS can run simultaneously. | 65536 |
| memlock (soft) | The maximum locked-in address space a particular user can allocate. Once allocated, the pages stay in physical memory, which speeds up operations on the ledger database. The soft limit applies to the owner of the process, which will be radixdlt in our case. | unlimited |
| memlock (hard) | The maximum locked-in address space that can be allocated on the OS as a whole. This value can only be set by the root user. A soft memlock cannot exceed the value of the hard memlock. | unlimited |

If you’re running other applications on the same server as your node (which is highly inadvisable) then setting memlock (soft) to unlimited will severely impact the performance of the other applications. These changes can be made manually, but the easiest way to do it is through the infinitely versatile babylonnode script. Prerequisites Obviously, you’ll need to have the babylonnode CLI installed before you optimise the node. It’s a good idea to download the babylonnode script, even if you have already installed it; this will ensure you’re running the latest version. For guidance on installing the babylonnode CLI, take a look at Installing the babylonnode CLI (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/node-setup-guided/node-setup-guided-installing-cli.md) . 1. Set up the optimiser - Execute the following command: babylonnode optimise-node The script will now download the support files - Log out of the shell, then log in again. 2. Run the optimiser - Once the optimiser has installed, and you’ve logged back into the shell, run the same command to carry out the optimisations: babylonnode optimise-node - The script will now ask if you’d like to update the ulimit settings. Press Y to update the settings to match the ones described above. - The script will now ask if you’d like to change the swap space. We’re recommending a swap file size of 8 GB (regardless of node type), so enter 8G. - Log out of your session to update the settings, then log in again.
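If you prefer to apply the limits by hand rather than via the optimiser, entries along the following lines in /etc/security/limits.conf would match the table above. This is a sketch only: the exact entries the babylonnode script writes (and the file it writes them to) may differ on your host, so treat these as illustrative.

```
# /etc/security/limits.conf — manual equivalent of the optimiser's ulimit changes (sketch).
# Format: <domain> <type> <item> <value>; "radixdlt" is the node process owner.
radixdlt  soft  nofile   65536
radixdlt  hard  nofile   65536
radixdlt  soft  nproc    65536
radixdlt  hard  nproc    65536
radixdlt  soft  memlock  unlimited
radixdlt  hard  memlock  unlimited
```

A re-login is needed for pam_limits to pick up changes, which matches the "log out, then log in again" steps above.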
3. Check your settings To check your settings, execute the following command: ulimit -a The resulting table should match the settings presented above.

core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 30953
max locked memory       (kbytes, -l) unlimited
max memory size         (kbytes, -m) unlimited
open files                      (-n) 65536
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 65536
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited

You can also check the swap size using this command: swapon --show which gives the swap size, along with the amount in use

NAME      TYPE SIZE USED PRIO
/swapfile file   8G  25M   -2

More information on using babylonnode You can use babylonnode to carry out a variety of administrative tasks. For a full reference on the commands the babylonnode CLI offers, please see the documentation on the CLI GitHub repository (https://github.com/radixdlt/babylon-nodecli/blob/main/docs/command_reference.adoc) . If you have any questions or run into problems, you can get support from the Radix team members and our fantastic community on our Discord server. ## Registering as a Validator URL: https://radix.wiki/developers/legacy-docs/run/node/node-maintenance-and-administration/node-registering-as-a-validator Updated: 2026-02-18 Summary: Make sure you’ve read the Node Introduction (../README.md) and have a good understanding of what a validator node is before you decide to register.
Run > Node > Maintenance and Administration > Registering as a Validator — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-maintenance-and-administration/node-registering-as-a-validator.md) Introduction Make sure you’ve read the Node Introduction (https://github.com/gguuttss/radix-docs/blob/master/run/node/README.md) and have a good understanding of what a validator node is before you decide to register. On Babylon, a "Validator" exists on the ledger independently from its currently configured public key (and associated node). This means that you can change the public key associated with your validator! You will manage your "Validator Component" using an "Owner Badge", which can be stored/protected in an account. This guide will take you through setting up an account in your wallet for this purpose. In practice, you may wish to use a "shared ownership" account, which is configured to allow access upon presenting badges, which you can hand out to your validator co-owners. Validator nodes provide the critical infrastructure of the Radix network, and owners of active validators receive tips and a share of the transaction fees to offset this cost. However, attempting to become one of the network’s 100 validator nodes is not a decision to be taken lightly; it requires a commitment to high-reliability operation and engagement with the Radix community. Registration as a validator node alone does not guarantee participation in consensus or that you will receive incentive rewards. A validator node is a node that has been configured to act on behalf of a validator component which has been created on the Radix ledger. The validator component can be configured to receive delegated stake and potentially be selected to participate in network consensus. It is also configured with a single public key.
At the start of each epoch, the validator components which are currently registered are ordered by stake, descending, and the top 100 are selected to form the validator set for the next epoch. A snapshot of each validator's address, current stake and public key is taken, and this forms the active validator set for this epoch. In the consensus layer, a node can represent the validator if it is configured with its validator address, and is using a key pair matching the key pair set in the validator component at the start of the last epoch. Click here for general information about staking and validator participation. (https://learn.radixdlt.com/categories/staking-on-radix?_gl=1*lyznnq*_ga*ODM5MDk4MjgxLjE2OTU4Nzg2Njg.*_ga_MZBXX3HP5Q*MTY5ODQxNjQyOC4xOC4xLjE2OTg0MTY5NzQuNy4wLjA.) Prerequisites - You should have completed setting up your node, and have it running. Follow our Node Setup Guide (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/README.md) if you haven’t done so yet. - You should have set up a Babylon Radix Wallet (https://www.radixdlt.com/wallet) . 1. Get your wallet connected and an account set up with XRD - Go to the Dashboard [ Stokenet (/contents/tech/releases/stokenet) | Mainnet (https://dashboard.radixdlt.com/) ], and connect your wallet using the connect button. - Ensure that you have some XRD for the transaction fees in your wallet (on Stokenet you can claim some through the wallet, see the Babylon Radix Wallet (https://wallet.radixdlt.com/) set-up guide) - Click on the Connect button, and make a note of the referenced account address. This will be your in the following steps. 2. Gather your node public key Query your node’s System API to retrieve the public key that’s associated with it (in a hex format).
Using babylonnode CLI (Guided Setup Mode) babylonnode api system identity Using curl (Manual Setup Mode) curl http://localhost:3334/system/identity You will receive a response like this:

{
  "public_key_hex": "...",
  "node_address": "...",
  "node_uri": "...",
  "node_name": "...",
  "node_id": "...",
  "validator_name": "...",
  "consensus_status": "NOT_CONFIGURED_AS_VALIDATOR"
}

Take a note of the public_key_hex. This will be your in the following steps. 3. Create your validator component Go to “Send Raw Transaction” on the console [ Stokenet (/contents/tech/releases/stokenet) | Mainnet (https://console.radixdlt.com/transaction-manifest) ]. The following transaction will create your validator entity, which also creates a validator owner badge, which gets deposited to your account. Keep this validator badge safe! This validator badge will be used for controlling your validator. The created validator badge will be a non-fungible, under the native "Validator Owner badge" resource, with a local id being the bytes of your validator address. Copy in the following manifest, replacing the placeholders and with their values from the previous steps: Stokenet (Testnet)

# Creation of a validator entity will cost a certain amount of XRD (equating to ~100 USD)
# So first, we withdraw enough XRD from our account to cover the fee, and then store it in a bucket.
CALL_METHOD Address("") "withdraw" Address("resource_tdx_2_1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxtfd2jc") Decimal("2000");
TAKE_FROM_WORKTOP Address("resource_tdx_2_1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxtfd2jc") Decimal("2000") Bucket("validator_creation_fee");
# We then create the validator.
# This will return us an owner badge, and any excess change after paying for the fee.
CREATE_VALIDATOR Bytes("")
# The following argument is the "Fee factor" - a decimal between 0 and 1
# which describes the proportion of the emissions that the owner will take.
# Unlike Olympia, this is expressed as a decimal proportion, not as basis points.
Decimal("0") Bucket("validator_creation_fee");
# And finally, we deposit the owner badge and any change from the validator creation
# back into our account
CALL_METHOD Address("") "try_deposit_batch_or_abort" Expression("ENTIRE_WORKTOP") None;

Mainnet

# Creation of a validator entity will cost a certain amount of XRD (equating to ~100 USD)
# So first, we withdraw enough XRD from our account to cover the fee, and then store it in a bucket.
CALL_METHOD Address("") "withdraw" Address("resource_rdx1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxradxrd") Decimal("2000");
TAKE_FROM_WORKTOP Address("resource_rdx1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxradxrd") Decimal("2000") Bucket("validator_creation_fee");
# We then create the validator.
# This will return us an owner badge, and any excess change after paying for the fee.
CREATE_VALIDATOR Bytes("")
# The following argument is the "Fee factor" - a decimal between 0 and 1
# which describes the proportion of the emissions that the owner will take.
# Unlike Olympia, this is expressed as a decimal proportion, not as basis points.
Decimal("0") Bucket("validator_creation_fee");
# And finally, we deposit the owner badge and any change from the validator creation
# back into our account
CALL_METHOD Address("") "try_deposit_batch_or_abort" Expression("ENTIRE_WORKTOP") None;

Submit this transaction, approve it in the wallet, and go to the results page in the dashboard (e.g. following the link from the Connect button). For the next steps, you will need the following: - Find your by taking a note of the validator address under "CREATED ENTITIES" in the transaction results page for the above transaction.
Then, go to the “Network staking” page of the dashboard [ Stokenet (/contents/tech/releases/stokenet)  |  Mainnet (https://dashboard.radixdlt.com/network-staking) ] and locate your validator, and click on it: - Find your by copying the value in the POOL_UNIT metadata. - Find your by copying the value in the CLAIM_NFT metadata. - Find your by copying the value in the OWNER_BADGE metadata. 4. Reconfigure your node to identify as this validator 4.1. If set up using the babylonnode CLI On the first run of the node configuration, you are asked for your validator address. On the first run you cannot know the validator address. After creating your validator component above, you can now reconfigure your node as below, and provide the when questioned. babylonnode docker config -m CORE Alternatively, you can also edit your config file, usually located at ~/babylon-node-config/config.yaml, and add the “validator_address” line, replacing the placeholder with your validator address.

core_node:
  core_release: ...
  data_directory: /home/ubuntu/babylon-ledger
  ...
  validator_address:

Restart the node by executing: babylonnode docker install Your node should now start up as a validator - which can be verified by checking that the validator address appears when you run: babylonnode api system identity 4.2. If set up with docker Adjust your docker compose to set the following environment variable: RADIXDLT_CONSENSUS_VALIDATOR_ADDRESS: Then restart. Your node should now start up as a validator - which can be verified by checking that the validator address appears when you run this inside your container: curl http://localhost:3334/system/identity 4.3. If set up with native JAR Configure with: consensus.validator_address= Then restart. Your node should now start up as a validator - which can be verified by checking that the validator address appears when you run: curl http://localhost:3334/system/identity 5.
Configure your validator Go to “Send Raw Transaction” on the console [ Stokenet (/contents/tech/releases/stokenet) | Mainnet (https://console.radixdlt.com/transaction-manifest) ]. Copy in the following manifest, replacing the , and placeholders with their values from the previous steps, and inserting appropriate values for the values. For more details on the metadata standard, please see the metadata standard docs (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/metadata-standards/metadata-for-wallet-display.md) . Stokenet (Testnet)

# Generate proof of owner badge
CALL_METHOD Address("") "create_proof_of_non_fungibles" Address("resource_tdx_2_1nfxxxxxxxxxxvdrwnrxxxxxxxxx004365253834xxxxxxxxxyerzzk") Array( NonFungibleLocalId(""), );
# Register your validator, so that it can be part of the validator set
CALL_METHOD Address("") "register";
# Set your validator to accept stake from non-owners
CALL_METHOD Address("") "update_accept_delegated_stake" true;
# OPTIONAL - Set metadata according to the metadata standard
# Feel free to remove any of these commands if you don't have things to put there
SET_METADATA Address("") "name" Enum("");
SET_METADATA Address("") "description" Enum("");
SET_METADATA Address("") "icon_url" Enum("");
SET_METADATA Address("") "info_url" Enum("");

Mainnet

# Generate proof of owner badge
CALL_METHOD Address("") "create_proof_of_non_fungibles" Address("resource_rdx1nfxxxxxxxxxxvdrwnrxxxxxxxxx004365253834xxxxxxxxxvdrwnr") Array( NonFungibleLocalId(""), );
# Register your validator, so that it can be part of the validator set
CALL_METHOD Address("") "register";
# Set your validator to accept stake from non-owners
CALL_METHOD Address("") "update_accept_delegated_stake" true;
# OPTIONAL - Set metadata according to the metadata standard
# Feel free to remove any of these commands if you don't have things to put there
SET_METADATA Address("") "name" Enum("");
SET_METADATA Address("") "description" Enum("");
SET_METADATA Address("") "icon_url" Enum("");
SET_METADATA Address("") "info_url" Enum("");

Submit this transaction, approve it in the wallet, and go to the results page in the dashboard. Note: You can use the "update_key", "update_fee" and "update_accept_delegated_stake" methods to update the validator’s Secp256k1PublicKey, decimal fee factor proportion, and accept-stake boolean respectively. Updating the fee factor takes effect after a number of epochs. This fee factor update delay will be 100 epochs (500 minutes) for testnets, but 2 weeks of epochs for mainnet. Further updates to the fee factor in that time will reset the time till update. 6. Stake a little to your validator Go to “Send Raw Transaction” on the console [ Stokenet (/contents/tech/releases/stokenet) | Mainnet (https://console.radixdlt.com/transaction-manifest) ]. Copy in the following manifest, replacing the placeholders with their values from the previous steps: Stokenet (Testnet)

# Withdraw 500 XRD from your account
CALL_METHOD Address("") "withdraw" Address("resource_tdx_2_1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxtfd2jc") Decimal("500");
TAKE_FROM_WORKTOP Address("resource_tdx_2_1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxtfd2jc") Decimal("500") Bucket("stake_xrd");
# Stake to your validator
CALL_METHOD Address("") "stake" Bucket("stake_xrd");
# Deposit your liquid stake token back to your account
CALL_METHOD Address("") "try_deposit_batch_or_abort" Expression("ENTIRE_WORKTOP") None;

Mainnet

# Withdraw 500 XRD from your account
CALL_METHOD Address("") "withdraw" Address("resource_rdx1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxradxrd") Decimal("500");
TAKE_FROM_WORKTOP Address("resource_rdx1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxradxrd") Decimal("500") Bucket("stake_xrd");
# Stake to your validator
CALL_METHOD Address("") "stake" Bucket("stake_xrd");
# Deposit your liquid stake token back to your account
CALL_METHOD Address("") "try_deposit_batch_or_abort" Expression("ENTIRE_WORKTOP") None;
Submit this transaction, approve it in the wallet, and go to the results page in the dashboard. Note - Instead of the "stake" method, there is also a "stake_as_owner" method which requires the owner badge, but can allow you to stake even if your validator does not currently accept delegated stake. 7. Lock some stake as owner stake New at Babylon is the concept of "Owner Stake". This works a little differently compared to Olympia. The "Owner Stake" concept has the same purpose - proving to the staking community that the validator runner is committed to running the validator in an orderly fashion, with the aim of giving confidence to community stakers that the node is reputable and worth staking to. But, because stake units can be traded, just holding stake units in your account isn’t enough to prove that you have such a commitment, and such holdings are hard to define. Instead, owner stake is validator stake units which are "locked" into the validator, in the owner stake vault. These can be requested to be unlocked, which immediately removes them from the owner stake vault and, instead, puts them in an unlocking vault. This process completes after a delay of a number of epochs, after which the owner stake units can be claimed. Multiple batches can be claimed in parallel, up to a limit (a few hundred parallel batches). This unlock delay will be 100 epochs (500 minutes) for testnets, but 4 weeks of epochs for mainnet. It is purposefully longer than the unstake delay, to allow a large owner unlock to be caught by the community and to allow time for users to unstake. Owner tips, fees and emissions at the end of epochs where your validator was active will automatically be staked, and sent to the owner stake vault.
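As a sanity check on the quoted figures, the testnet delay of "100 epochs (500 minutes)" implies roughly 5-minute epochs. A quick sketch of that arithmetic (the per-epoch duration is inferred from the numbers above, not separately documented here):

```shell
# Testnet owner-stake unlock delay, derived from the figures above:
# 100 epochs at ~5 minutes per epoch.
EPOCH_MINUTES=5
TESTNET_UNLOCK_EPOCHS=100
UNLOCK_MINUTES=$(( TESTNET_UNLOCK_EPOCHS * EPOCH_MINUTES ))
echo "$UNLOCK_MINUTES"  # minutes until unlocked owner stake units can be claimed
```

The mainnet delay ("4 weeks of epochs") scales the same way, just with a much larger epoch count.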
Stokenet (Testnet)

# Generate proof of owner badge
CALL_METHOD Address("") "create_proof_of_non_fungibles" Address("resource_tdx_2_1nfxxxxxxxxxxvdrwnrxxxxxxxxx004365253834xxxxxxxxxyerzzk") Array( NonFungibleLocalId(""), );
# Withdraw 250 stake units from your account
CALL_METHOD Address("") "withdraw" Address("") Decimal("250");
TAKE_FROM_WORKTOP Address("") Decimal("250") Bucket("stake_units_to_lock");
# Lock owner stake to your validator
CALL_METHOD Address("") "lock_owner_stake_units" Bucket("stake_units_to_lock");

Mainnet

# Generate proof of owner badge
CALL_METHOD Address("") "create_proof_of_non_fungibles" Address("resource_rdx1nfxxxxxxxxxxvdrwnrxxxxxxxxx004365253834xxxxxxxxxvdrwnr") Array( NonFungibleLocalId(""), );
# Withdraw 250 stake units from your account
CALL_METHOD Address("") "withdraw" Address("") Decimal("250");
TAKE_FROM_WORKTOP Address("") Decimal("250") Bucket("stake_units_to_lock");
# Lock owner stake to your validator
CALL_METHOD Address("") "lock_owner_stake_units" Bucket("stake_units_to_lock");

You can start unlocking stake units with "start_unlock_owner_stake_units" Decimal("") and claim stake units which have finished unlocking with "finish_unlock_owner_stake_units", and then deposit the returned stake units to your account. 8. Checking your validator is active This only applies if you’ve made it to the top 100 validators. Wait 5 minutes or so for the next epoch, then check the identity endpoint of your node again. This time, it should include "consensus_status": "VALIDATING_IN_CURRENT_EPOCH". Also check out the validators page on the dashboard and see if you can spot your validator. 9. [Stokenet only] Request the RDX team stake to your validator Post on the #node-runners channel on the Radix Discord server (https://go.radixdlt.com/Discord) , and the Network Team will be able to stake a large chunk of XRD to your validator, so it can get a decent count of rounds in consensus.
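The step-8 check can be scripted. Below is a minimal sketch that extracts consensus_status from an identity response; the sample JSON stands in for real node output (on a live node you would fetch it with curl from the identity endpoint as shown earlier):

```shell
# On a live node, fetch the response instead of hard-coding it:
#   response=$(curl -s http://localhost:3334/system/identity)
# Sample response, for illustration only:
response='{"consensus_status": "VALIDATING_IN_CURRENT_EPOCH"}'

# Pull out the consensus_status value without extra tooling:
status=$(echo "$response" | sed -n 's/.*"consensus_status": *"\([^"]*\)".*/\1/p')
echo "$status"
```

If the value is still NOT_CONFIGURED_AS_VALIDATOR after an epoch change, revisit step 4.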
## Signalling Protocol Update Readiness URL: https://radix.wiki/developers/legacy-docs/run/node/node-maintenance-and-administration/node-protocol-updates Updated: 2026-02-18 Summary: For the Radix network to commit transactions, it must come to an agreement on the outcome of those transactions. This requires all the nodes constituting the ne Run > Node > Maintenance and Administration > Signalling Protocol Update Readiness — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-maintenance-and-administration/node-protocol-updates.md) Introduction For the Radix network to commit transactions, it must come to an agreement on the outcome of those transactions. This requires all the nodes constituting the network to be operating according to the same set of rules. These rules are known as the protocol of the network. As Radix is further developed, the protocol will be expanded and amended, resulting in the publishing of new node versions which can support a new protocol version. But if nodes transitioned to using this new protocol version as soon as they were updated, the network would break, because its nodes wouldn't all be using the same set of rules. Instead, the network needs to co-ordinate, and ensure that nodes pick up the same version at the same time. The conditions for this switch-over are known as the triggers of the protocol update. The node currently supports two mechanisms to achieve this: - Unconditional enactment at the start of a given epoch: - This is useful for genesis, test environments, and for hard-coding upgrades after-the-fact. - Nodes must have been updated by the time this epoch is hit, otherwise the network will suffer incompatibility / liveness issues. - Validator readiness-signal based enactment at the start of an epoch: - This is the standard option for mainnet protocol updates. - Validator owners must signal readiness for the update by making a call to their validator component with the readiness signal for the given update.
- Each validator has space for one optional readiness signal at any time.
- The readiness signal is not just the protocol name, but a unique string derived from the protocol version and its specific trigger condition.
- A combination of constraints must hold for the update to be enacted:
  - A lower_bound_epoch_inclusive - we must be at the start of this epoch or a future epoch to enact the update.
  - An upper_bound_epoch_exclusive - the update is no longer enactable at the start of this epoch.
  - One or more readiness_thresholds. Each of these includes a required_ratio_of_stake_supported (between 0 and 1) and a required_consecutive_completed_epochs_of_support (0 or more). At least one readiness threshold must be met at the start of an epoch to enact the update. A threshold is met if validators representing at least required_ratio_of_stake_supported of the total active validator set's stake are currently signalling the required readiness signal, AND have also been doing so for all of the last required_consecutive_completed_epochs_of_support additional epochs.

Protocol Versions

The original protocol version was called babylon; it was introduced with the v1.0.0 node and immediately enacted. On top of it, consecutive protocol updates are released and enacted after gathering enough voting power (i.e. share of total stake). Validators have to individually signal readiness for a particular protocol version, and maintain that signal for some number of consecutive epochs. To limit the voting time, the enactment itself must happen within a defined epoch range. 
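The readiness-threshold rule above can be sketched in code. This is a simplified model for illustration (the Validator fields and function name are invented, not the node's actual implementation):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Validator:
    stake: float                 # active stake (any consistent unit)
    signal: Optional[str]        # readiness signal currently set, or None
    epochs_signalled: int        # consecutive completed epochs the signal has been held

def threshold_met(validators, signal, required_ratio, required_consecutive_epochs):
    """Check one readiness threshold at the start of an epoch.

    The threshold is met if validators currently holding `signal`, and having
    held it for at least `required_consecutive_epochs` completed epochs,
    represent at least `required_ratio` of the total active stake.
    """
    total = sum(v.stake for v in validators)
    supporting = sum(
        v.stake for v in validators
        if v.signal == signal and v.epochs_signalled >= required_consecutive_epochs
    )
    return total > 0 and supporting / total >= required_ratio
```

For example, with 60% of stake signalling long enough and another 30% having signalled for only one epoch, a 75%-for-3-epochs threshold is not yet met, while a 50%-for-3-epochs threshold is.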
The table below summarizes the protocol updates configured (https://github.com/radixdlt/babylon-node/blob/main/core-rust/state-manager/src/protocol/protocol_configs/mainnet_protocol_config.rs) for mainnet so far:

| Version | Readiness signal name | Enactment epoch range | Voting requirement |
| --- | --- | --- | --- |
| Anemone (Node v1.1.0) | 220e2a4a4e86e3e6000000000anemone | [70019; 74051) (from ~2024-02-05T18:00:00Z to ~2024-02-19T18:00:00Z) | 75% stake for ~4 days |
| Bottlenose (Node v1.2.0) | 86894b9104afb73a000000bottlenose | [104291; 112355) (from ~2024-06-03T18:00:00Z to ~2024-07-01T18:00:00Z) | 75% stake for ~2 weeks |
| Cuttlefish (Node v1.3.0) | 96e00440adafe5e2000000cuttlefish | [158682; 161562) (from ~2024-12-10T16:03:58.703Z to ~2024-12-20T16:03:58.703Z) | 75% stake for ~2 weeks |

Other internal and public test networks have their own separate configurations. The one for stokenet can be found here (https://github.com/radixdlt/babylon-node/blob/main/core-rust/state-manager/src/protocol/protocol_configs/stokenet_protocol_config.rs) .

The definitive protocol update status of a specific node can be queried from its System Health API endpoint (by default: http://localhost:3334/system/health). It will contain a list of enacted and pending protocol updates (including their readiness_signal_names).

Protocol Update Execution

Each new protocol version comes with a potentially new engine configuration, and can also inject new transactions - currently these are limited to system "flash" transactions which update engine substates. The final consensus proof before an update signs off on the fact that a protocol update will be enacted; this ensures that the previous epoch doesn't end unless a quorum of validators definitely agree that the enactment should proceed. The update execution then commits zero or more batches of transactions - these come with new ledger proofs with a protocol update execution origin. 
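The enactment windows in the table can be sanity-checked against the epoch ranges, assuming a roughly constant epoch length (the table itself implies ~5-minute mainnet epochs: Anemone's 4032-epoch window spans ~14 days). A sketch, with an invented helper name:

```python
from datetime import datetime, timedelta

def approx_window_close(start_epoch, end_epoch, window_open, epoch_minutes=5):
    """Estimate when the exclusive upper-bound epoch will be reached,
    assuming each epoch lasts `epoch_minutes` minutes (an approximation)."""
    return window_open + timedelta(minutes=(end_epoch - start_epoch) * epoch_minutes)

# Anemone: [70019; 74051) opened around 2024-02-05T18:00:00Z
opened = datetime(2024, 2, 5, 18, 0)
closed = approx_window_close(70019, 74051, opened)
# 4032 epochs * 5 min = 14 days, matching ~2024-02-19T18:00:00Z
```

Real epoch lengths vary slightly, so treat this only as a rough estimate, not a deadline calculator.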
Readiness Signal Process

Validators who wish to signal their readiness should follow the instructions below.

Update your validator node

Update your node to the latest version, which includes the pending protocol update. Once you do so, verify that your node has been successfully updated:
- If you're using the babylonnode CLI: babylonnode api system version
- Otherwise, you can use curl, e.g.: curl http://127.0.0.1:3334/system/version

Then check the System Health API for the required readiness signal:
- If you're using the babylonnode CLI: babylonnode api system health
- Otherwise, you can use curl, e.g.: curl http://127.0.0.1:3334/system/health (if on docker, this will need to be run inside the container, or on the port you have mapped to 3334).

You'll find readiness_signal_status in the response for the given protocol version, which should be READINESS_NOT_SIGNALLED. It will also include a readiness_signal_name, if the protocol version has a validator readiness trigger. It should be a string 32 characters long. For example, for anemone on mainnet it is 220e2a4a4e86e3e6000000000anemone - but it will be different for other protocol versions. Don't just copy it from here, verify against your system API first!

Signal readiness

This is done by submitting a transaction, authorized with your validator owner badge, similar to validator registration (https://docs-babylon.radixdlt.com/main/node-and-gateway/register-as-validator.html) , or performing other validator actions (see e.g. this discussion on radix talk (https://radixtalk.com/t/validator-transaction-manifests/1886) ). If you are using a tool to manage your validator (such as Faraz's CLI tool), consult the tool owner / documentation. If you are using a validator badge owned by an account under a Radix wallet, you can use the dev console to submit the transaction manifest (https://console.radixdlt.com/transaction-manifest) . 
You must use a wallet that controls the account containing your validator owner badge to sign the transaction! If you've forgotten where it lives, see the "Owner" field by clicking on your validator on the staking dashboard (https://dashboard.radixdlt.com/network-staking) . You can also find your owner badge id and validator address there.

Submit the following manifest, replacing the four placeholders with their appropriate values:

CALL_METHOD Address("") "create_proof_of_non_fungibles" Address("resource_rdx1nfxxxxxxxxxxvdrwnrxxxxxxxxx004365253834xxxxxxxxxvdrwnr") Array( NonFungibleLocalId("") );
CALL_METHOD Address("") "signal_protocol_update_readiness" "";

Verify successful signalling

After your transaction is successfully committed, re-check the System Health API:
- If you're using the babylonnode CLI: babylonnode api system health
- Otherwise, you can use curl, e.g.: curl http://127.0.0.1:3334/system/health

The readiness_signal_status should update to READINESS_SIGNALLED.

Monitoring validator readiness signals

For a quick overview of the validator set's readiness signals, you can check out community validator dashboards such as Bart's StakeSafe dashboard (https://validators.stakesafe.net/) . If you want to monitor this yourself:
- The current readiness signal for any individual validator is available from the core/state/validator endpoint. At any given time, a validator can signal readiness for a single protocol version's readiness signal.
- The current readiness signals of all the validators in the active set can be extracted from core/state/consensus-manager with an opt-in in the request, grouped by signal. This is the easiest way to see "how close" the network is to a threshold.
- The node exposes metrics in the ng_protocol sub-namespace for various protocol-update-related monitoring needs.
- Finally, to monitor when something got enacted, the /stream/proofs API can be used to filter for state update initialization and execution proofs. 
It can also be used to read ledger headers (our equivalent of block headers).

## Monitoring node's health URL: https://radix.wiki/developers/legacy-docs/run/node/node-maintenance-and-administration/node-monitoring-health Updated: 2026-02-18 Summary: Radix node provides the System API that can be used to monitor its health. Run > Node > Maintenance and Administration > Monitoring node's health — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-maintenance-and-administration/node-monitoring-health.md)

The Radix node provides the System API, which can be used to monitor its health. You can query your own node at these endpoints to get various kinds of data about the operation of the node. The easiest way to call the endpoints is through the babylonnode script (see Installing the babylonnode CLI (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/node-setup-guided/node-setup-guided-installing-cli.md) for more information), but you can also use curl, for example.

The /health endpoint

The most basic check on your node’s health is ensuring that it is running and syncing with the network. This is what the /health endpoint is used for. Execute one of the following commands to check the status of your node:
- babylonnode: babylonnode api system health
- curl: curl -k -u admin:nginx-password "https://localhost/system/health"

The call returns a simple status message that is easy to check and monitor, like this: { "status": "UP", ... }

The status message will be one of the following codes:
- BOOTING_PRE_GENESIS - the node is booting and not ready to accept requests
- SYNCING - the node is catching up with the network
- UP - the node is in sync with consensus
- OUT_OF_SYNC - the node is out of sync

Refer to the Health and Metrics (https://github.com/radixdlt/babylon-node/blob/main/docs/logging-and-monitoring/health-metrics.md) documentation for more details on monitoring your node’s health. 
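A monitoring script can map these status codes onto alerting states. A minimal sketch using only the Python standard library; the coarse "ok"/"starting"/"alert" grouping is our own convention, not part of the API:

```python
import json
from urllib.request import urlopen

HEALTHY = {"UP"}
TRANSIENT = {"BOOTING_PRE_GENESIS", "SYNCING"}

def classify_health(response_body: str) -> str:
    """Map a /system/health response body to a coarse state for alerting."""
    status = json.loads(response_body)["status"]
    if status in HEALTHY:
        return "ok"
    if status in TRANSIENT:
        return "starting"
    return "alert"  # e.g. OUT_OF_SYNC

def check_node(url="http://127.0.0.1:3334/system/health"):
    # Query your own node's System API (adjust host/port/auth to your setup).
    with urlopen(url) as resp:
        return classify_health(resp.read().decode())
```

Wire `check_node()` into whatever scheduler or alerting you already use; for dashboards, the Prometheus metrics endpoint described below is the better fit.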
The /prometheus/metrics endpoint The /metrics endpoint provides a wealth of performance and operational data. While it can be queried directly, it is designed for use with monitoring and alerting dashboards, such as Grafana (https://grafana.com/) , and so provides its data in the Prometheus data format (https://prometheus.io/docs/instrumenting/exposition_formats/#text-based-format) . Radix provides a convenient Grafana monitoring dashboard installation through the babylonnode CLI (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-maintenance-and-administration/node-setting-up-grafana.md) . ## Maintenance and Administration URL: https://radix.wiki/developers/legacy-docs/run/node/node-maintenance-and-administration/node-maintenance-and-administration Updated: 2026-02-18 Summary: Legacy documentation: Maintenance and Administration Run > Node > Maintenance and Administration — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-maintenance-and-administration/README.md) ## Update the Node URL: https://radix.wiki/developers/legacy-docs/run/node/node-setup/manual-setup-advanced/node-setup-systemd/node-setup-systemd-update Updated: 2026-02-18 Summary: These instructions will show you how to upgrade nodes installed as systemd instances. Run > Node > Installation and Basic Setup > Manual Setup (Advanced) > Manual Node Setup with systemd > Update the Node — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/manual-setup-advanced/node-setup-systemd/node-setup-systemd-update.md) Introduction These instructions will show you how to upgrade nodes installed as systemd instances. Please read the instructions all the way through first, before applying the changes. Prepare to update First, if you haven’t already done so, ensure that you have backed up your keystore file (e.g. keystore.ks). This key file contains the private key that determines your node’s unique address. 
If anything goes wrong in the update process, as long as you have your key file you can always reinstall the node from scratch and use it to recover access to your node with its previous network identity.

Next, you may want to consider using a backup node to perform a switch to the updated node with minimal interruption (especially if running a validator node) – or to provide a quick recovery if something goes wrong during the update. See our recommendations for Maintaining Uptime for more.

Download the latest node distribution
- Go to https://github.com/radixdlt/babylon-node/releases and look for the entry with the [ Latest release ] marker.
- Download the corresponding node distribution zip file and the native library file (see the Node Setup instructions (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/manual-setup-advanced/node-setup-systemd/README.md) “Downloading the Radix node software” section for details).
- Unzip both files and place them in some directory on your system (e.g. /opt/radixdlt/babylon-node/core-v1.2.1 ); again, this is analogous to the node setup docs.

An example folder structure could look like this:

tree /opt/radixdlt/babylon-node -L 2
/opt/radixdlt/babylon-node
├── core-v1.2.0
│   ├── bin
│   ├── lib
│   └── libcorerust.so
└── core-v1.2.1
    ├── bin
    ├── lib
    └── libcorerust.so

where core-v1.2.0 is the old (current) version and core-v1.2.1 is the new one. Make sure to put a correct libcorerust.so file in the new directory! Always use the one that comes with the rest of the node dist, don’t copy it from the previous version!

Make sure that all new files have a correct owner: sudo chown -R radixdlt:radixdlt /opt/radixdlt

Stop the node process

Before upgrading the software, you’ll need to shut the node down: sudo systemctl stop babylon-node

Update the systemd unit file

We can now point the systemd service to use the new binary. 
Update the ExecStart= line in your systemd unit file (e.g. /etc/systemd/system/babylon-node.service) to point to the directory you created in the previous step. E.g.:

ExecStart=/opt/radixdlt/babylon-node/core-v1.2.1/bin/core -config=/srv/radixdlt/babylon-mainnet/mainnet.config

And reload the configuration: sudo systemctl daemon-reload

Save your existing nginx configuration

If you still have an existing nginx configuration from your previous installation then it’s a good idea to move it to another location, e.g.: sudo mv babylon-nginx-fullnode-conf.zip babylon-nginx-fullnode-conf.zip.spare

Download the latest nginx configuration
- Go to https://github.com/radixdlt/babylon-nginx/releases and look for the entry with the [ Latest release ] marker.
- Download the babylon-nginx-fullnode-conf.zip file.

Install the new nginx configuration
- Unzip the nginx configuration (you can overwrite all the files): unzip babylon-nginx-fullnode-conf.zip
- Copy the files to the nginx setup directory: sudo cp -r conf.d/ /etc/nginx/
- And now copy the nginx configuration files for your node type. If you are running a full node then execute: sudo cp nginx-fullnode.conf /etc/nginx/nginx.conf

Restart nginx and the node

sudo systemctl restart babylon-node
sudo systemctl restart nginx

Verify that the node has been upgraded

You can check the version of the node software by sending an information request using curl: curl -k -u admin:nginx-password "https://localhost/system/version"

The response is a JSON string that carries the version number as its payload, e.g.: {"version":"v1.2.1"}

## Manual Node Setup with systemd URL: https://radix.wiki/developers/legacy-docs/run/node/node-setup/manual-setup-advanced/node-setup-systemd/node-setup-systemd Updated: 2026-02-18 Summary: Welcome to the Radix node manual setup guide! 
Run > Node > Installation and Basic Setup > Manual Setup (Advanced) > Manual Node Setup with systemd — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/manual-setup-advanced/node-setup-systemd/README.md) Introduction Welcome to the Radix node manual setup guide! Before we begin, make sure you’ve read the Node Setup Introduction (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/README.md) . On this page we’ll guide you through the process of setting up your node to run directly on the host system, and we’ll do it manually (i.e. without the use of our helper babylonnode CLI). Using the CLI is an alternative and easier way (for dedicated server instances). Manual mode is however better suited for shared servers (i.e. when the server isn’t solely dedicated to running a Radix node) or non-Ubuntu hosts. If you wish to switch to a CLI path, go to Guided Setup Instructions (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/node-setup-guided/README.md) . There’s also an alternative path that utilizes Docker here (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/manual-setup-advanced/node-setup-docker.md) (also manually, i.e. without using our CLI). You can use any compatible operating system (again, check Node Setup Introduction (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/README.md) for details), however all examples in this guide will be for Ubuntu 22.04. Consult your system’s manual for the corresponding packages/commands. Prerequisites Update the system First, let’s make sure that your system is up to date, if you’re on Ubuntu run: sudo apt update && sudo apt -y upgrade Install Java Then, we’ll need Java (version 17). We recommend using OpenJDK. On Ubuntu it’s the openjdk-17-jdk package: sudo apt install openjdk-17-jdk Configure the firewall You’ll need to open the p2p gossip port, which by default is 30000. 
sudo ufw allow 30000/tcp

If you are using a cloud service then you must also arrange for external port access through your service provider: this is usually done through the service management console. If you are hosting the service yourself, then you may need to open access to the ports through your hardware router and/or configure port forwarding.

Recommended: create a dedicated user for running the node

We recommend running the node as a dedicated system user, rather than as the root user. We’ll use the radixdlt user name throughout this tutorial, however you can choose any name you like.

sudo useradd radixdlt -m # create a new user
sudo passwd radixdlt # set its password

Preparing the directory structure

We’ll need a place to store both the node software package and the actual ledger data. You can choose any directory on your system, as long as it’s accessible to the user that runs the node process (the `radixdlt` user in this tutorial). We’ll also need some place to put a keystore file (containing a private key that uniquely identifies your node on the network). An example folder structure could look like this:

- /opt/radixdlt/babylon-node - for the node software package
- /srv/radixdlt/babylon-mainnet/data - for the node storage directory (ledger data)
- /srv/radixdlt/babylon-mainnet/keystore.ks - for the keystore file
- /srv/radixdlt/babylon-mainnet/mainnet.config - for the configuration file

Make sure to replace the paths in the steps below if you choose to use a different folder structure. If you’re setting up a Stokenet node (you can read more about the different networks on the Node Setup Introduction (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/README.md) page), you may wish to replace “mainnet” with “stokenet”.

Downloading the Radix node software

The Radix node consists of two components: the main Java application and an additional native library (.dll or .so file, depending on your system). 
You can find the latest release on our GitHub: babylon-node/releases/latest (https://github.com/radixdlt/babylon-node/releases/latest) . The main Java application is the zip file named babylon-node-va.b.c.zip (where a.b.c is the version number). Unlike the Java application, the native library is platform-dependent. You should download the file that corresponds to your system architecture. For example, for 64-bit Linux it’s the babylon-node-rust-arch-linux-x86_64-release-va.b.c.zip file.

Download and extract both files; you can use wget. For example:

# install wget and unzip if they don't come preinstalled on your system
sudo apt install wget unzip
# make sure to replace the version (always use the latest!) and platform architecture
wget https://github.com/radixdlt/babylon-node/releases/download/v1.2.1/babylon-node-v1.2.1.zip
wget https://github.com/radixdlt/babylon-node/releases/download/v1.2.1/babylon-node-rust-arch-linux-x86_64-release-v1.2.1.zip
# extract both files
unzip babylon-node-v1.2.1.zip
unzip babylon-node-rust-arch-linux-x86_64-release-v1.2.1.zip

Following the directory structure we outlined above, let’s move both files to /opt/radixdlt/babylon-node:

sudo mkdir -p /opt/radixdlt/babylon-node
sudo mv core-v1.2.1 /opt/radixdlt/babylon-node
sudo mv libcorerust.so /opt/radixdlt/babylon-node

We should also make sure that the files are accessible/owned by the radixdlt user: sudo chown -R radixdlt:radixdlt /opt/radixdlt

The resultant folder structure should look like this:

tree /opt/radixdlt/babylon-node -L 2
/opt/radixdlt/babylon-node
├── core-v1.2.1
│   ├── bin
│   └── lib
└── libcorerust.so

Generating the key pair

The keystore contains a randomly-generated private key that determines your node’s unique address. This means that if you lose your keystore file, you will forever lose your node address and you’ll need to generate a new key for your node. 
We’ll be storing the key in /srv/radixdlt/babylon-mainnet/keystore.ks , so let’s make sure the directory exists and has a correct owner/permissions:

sudo mkdir -p /srv/radixdlt/babylon-mainnet/data
sudo chown -R radixdlt:radixdlt /srv/radixdlt

Now let’s run the key generator tool that comes with the Radix node distribution:

sudo /opt/radixdlt/babylon-node/core-v1.2.1/bin/keygen \
  --keystore=/srv/radixdlt/babylon-mainnet/keystore.ks \
  --password=

It goes without saying that you should replace with an actual password. Use a strong one and don’t forget it! Again, make sure the file belongs to our user: sudo chown radixdlt:radixdlt /srv/radixdlt/babylon-mainnet/keystore.ks

Preparing the configuration file

It’s time to choose the network that your node will join (you can read more about the different networks on the Node Setup Introduction (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/README.md) page).

- Mainnet
  - use network.id=1
  - use the following network.p2p.seed_nodes: radix://node_rdx1qf2x63qx4jdaxj83kkw2yytehvvmu6r2xll5gcp6c9rancmrfsgfw0vnc65@babylon-mainnet-eu-west-1-node0.radixdlt.com,radix://node_rdx1qgxn3eeldj33kd98ha6wkjgk4k77z6xm0dv7mwnrkefknjcqsvhuu4gc609@babylon-mainnet-ap-southeast-2-node0.radixdlt.com,radix://node_rdx1qwrrnhzfu99fg3yqgk3ut9vev2pdssv7hxhff80msjmmcj968487uugc0t2@babylon-mainnet-ap-south-1-node0.radixdlt.com,radix://node_rdx1q0gnmwv0fmcp7ecq0znff7yzrt7ggwrp47sa9pssgyvrnl75tvxmvj78u7t@babylon-mainnet-us-east-1-node0.radixdlt.com
- Stokenet
  - use network.id=2
  - use the following network.p2p.seed_nodes: 
radix://node_tdx_2_1qv89yg0la2jt429vqp8sxtpg95hj637gards67gpgqy2vuvwe4s5ss0va2y@babylon-stokenet-ap-south-1-node0.radixdlt.com,radix://node_tdx_2_1qvtd9ffdhxyg7meqggr2ezsdfgjre5aqs6jwk5amdhjg86xhurgn5c79t9t@babylon-stokenet-ap-southeast-2-node0.radixdlt.com,radix://node_tdx_2_1qwfh2nn0zx8cut5fqfz6n7pau2f7vdyl89mypldnn4fwlhaeg2tvunp8s8h@babylon-stokenet-eu-west-1-node0.radixdlt.com,radix://node_tdx_2_1qwz237kqdpct5l3yjhmna66uxja2ymrf3x6hh528ng3gtvnwndtn5rsrad4@babylon-stokenet-us-east-1-node1.radixdlt.com

Create a /srv/radixdlt/babylon-mainnet/mainnet.config file with the following content:

# The ID of the network to connect to (Mainnet=1, Stokenet=2), see above
network.id=1|2
# A comma-separated list of network seed nodes, copy-paste from above (must match network.id)
network.p2p.seed_nodes=
node.key.path=/srv/radixdlt/babylon-mainnet/keystore.ks
db.location=/srv/radixdlt/babylon-mainnet/data

Remember to use your own paths in node.key.path and db.location if they differ from the examples.

There are a few additional config options that you might wish to add to your config file at this point. They’re all optional and we provide reasonable defaults, however you may want to adjust some of them to fit your circumstances:

Additional config options

# Here you can specify your node's public IP address
# (you can get it from ifconfig.me, for example).
# You can leave this empty, in which case the node will
# try to get it automatically.
network.host_ip=
# The address that the Core API will bind to.
# Defaults to 127.0.0.1.
# It is recommended to keep the default and use
# a reverse-proxy (nginx) to access the Core API
# externally (if you need it).
# We show how to do it later on this page.
api.core.bind_address=
# Core API port defaults to 3333, you can change it here
api.core.port=
# Similarly, the System API binds to 127.0.0.1 by default
# and we recommend keeping it and using a reverse-proxy. 
api.system.bind_address=
# System API port defaults to 3334, you can change it here
api.system.port=
# Similarly to the above APIs, this defaults to 127.0.0.1
api.prometheus.bind_address=
# Prometheus API port defaults to 3335, you can change it here
api.prometheus.port=
# This is the local port that the P2P server binds to.
# If you change it, make sure to enable that port in your firewall
# (instructions how to do that have been shown above).
network.p2p.listen_port=30000
# This is a P2P port that's broadcasted to other peers (together with `network.host_ip`).
# If you're not using NAT/port-forwarding this should be the same as `listen_port`.
network.p2p.broadcast_port=30000
# An additional ledger index for transaction data, enabled by default
db.local_transaction_execution_index.enable=true|false
# An additional ledger index for transactions that change accounts, enabled by default
db.account_change_index.enable=true|false
# Enabled by default, allows disabling a subset of Core API endpoints whose responses are potentially unbounded
api.core.flags.enable_unbounded_endpoints=true|false

Don’t forget to change the owner of the config file to radixdlt : sudo chown radixdlt:radixdlt /srv/radixdlt/babylon-mainnet/mainnet.config

Running the node

At this point we’re ready to actually run the node! We’ll start with a simple script that just runs the node process in a terminal session, and then we’ll productionize our setup by introducing a systemd service for our node.

Manual test run

Running the node is just a matter of running an executable file that comes with the node distribution. That executable is located in the bin folder: /opt/radixdlt/babylon-node/core-v1.2.1/bin/core (the path may vary if you’re running a newer version than 1.2.1, which is used in this example - always use the latest version!). We’ll need to tell it where to find the files it requires (the keystore, config file, etc.) as well as your keystore password. It’ll also require some additional JVM parameters. 
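Since the node config is a plain key=value properties file, it can also be generated programmatically, which helps when provisioning several nodes. A hypothetical helper sketch (the option names come from the docs above; the function itself is illustrative, not part of any Radix tooling):

```python
def render_node_config(network_id, seed_nodes, keystore_path, data_path, extra=None):
    """Render a properties-style node config like the mainnet.config above.

    `network_id` is 1 for mainnet or 2 for stokenet; `seed_nodes` is a list of
    radix:// seed node URIs; `extra` holds any of the optional settings
    (e.g. network.p2p.listen_port).
    """
    options = {
        "network.id": str(network_id),
        "network.p2p.seed_nodes": ",".join(seed_nodes),
        "node.key.path": keystore_path,
        "db.location": data_path,
    }
    options.update(extra or {})
    return "\n".join(f"{key}={value}" for key, value in options.items()) + "\n"
```

For example, writing `render_node_config(1, mainnet_seeds, "/srv/radixdlt/babylon-mainnet/keystore.ks", "/srv/radixdlt/babylon-mainnet/data", extra={"network.p2p.listen_port": "30000"})` to mainnet.config reproduces the file shown above.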
For clarity, let’s put the required environment variables in a separate file, and then create a node run script. The environment file (e.g. /srv/radixdlt/babylon-mainnet/environment): # Enter your keystore password here RADIX_NODE_KEYSTORE_PASSWORD="" # Update java.library.path if you're using a different path for your node dist JAVA_OPTS="-Djava.library.path=/opt/radixdlt/babylon-node --enable-preview -server -Xms2g -Xmx12g -XX:MaxDirectMemorySize=2048m -XX:+HeapDumpOnOutOfMemoryError -XX:+UseCompressedOops -Djavax.net.ssl.trustStore=/etc/ssl/certs/java/cacerts -Djavax.net.ssl.trustStoreType=jks -Djava.security.egd=file:/dev/urandom -DLog4jContextSelector=org.apache.logging.log4j.core.async.AsyncLoggerContextSelector" # Update this if you're using a different path for your node dist LD_PRELOAD="/opt/radixdlt/babylon-node/libcorerust.so" Node run script (e.g. /srv/radixdlt/babylon-mainnet/run.sh): #!/bin/bash # Load our environment variables from a file set -a source /srv/radixdlt/babylon-mainnet/environment set +a # Run the node; update the paths if needed /opt/radixdlt/babylon-node/core-v1.2.1/bin/core -config=/srv/radixdlt/babylon-mainnet/mainnet.config At this point, the only thing left to do is to make the script executable and run it! # Let's also change the owner of both files... sudo chown radixdlt:radixdlt /srv/radixdlt/babylon-mainnet/environment sudo chown radixdlt:radixdlt /srv/radixdlt/babylon-mainnet/run.sh # ...and set proper permissions for the environment file sudo chmod 700 /srv/radixdlt/babylon-mainnet/environment # Make the script executable sudo chmod +x /srv/radixdlt/babylon-mainnet/run.sh # ...and run it as the radixdlt user sudo su - radixdlt -c /srv/radixdlt/babylon-mainnet/run.sh Verifying that the node works After you’ve run the script, your node should start processing the genesis transactions. You should be seeing messages that begin with “Committing data ingestion chunk…” in the output log. 
This may take up to 30 minutes to fully process, depending on your hardware. Usually it finishes in around 10-15 minutes.

Warning: Don’t terminate (or restart) the process while the genesis transactions are being committed! If you do so before the whole process is completed, you’ll need to wipe the data directory (e.g. /srv/radixdlt/babylon-mainnet/data) and start from scratch.

After the process completes you should start seeing messages like “lgr_commit{epoch= … }”, which indicate that the node is working correctly (at which point you can safely stop/restart it any time you wish). At this point you can also query the System API and the Core API. The System API’s health endpoint provides an overall health summary of a node: curl 127.0.0.1:3334/system/health

The response should include "status":"SYNCING". All possible statuses have been described in detail here (https://github.com/radixdlt/babylon-node/blob/main/docs/logging-and-monitoring/health-metrics.md) .

Tip! jq is a great tool for querying the endpoints that return JSON data (such as the node’s System and Core APIs). On Ubuntu you can install it with: sudo apt install jq

We’ll use it from now on in the examples. But if you prefer not to install it, simply skip the | jq pipe suffix from the example commands.

You can also inspect the Core API and query the current state of the ledger: curl 127.0.0.1:3333/core/status/network-status -X POST -H 'Content-Type: application/json' --data '{"network": "mainnet"}' | jq

There’s a current_epoch_round field in the response, and both the epoch and round number should be steadily growing as you re-run the query - this indicates that the node is correctly syncing the ledger state.

Congratulations! At this point you’ve got a fully operational Radix node connected to the network. Note that it might take some time until it’s fully synced up with the latest ledger state. You can check the current epoch using one of the community-run Radix network dashboards on the internet. 
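The "steadily growing" check on current_epoch_round can be automated by comparing two samples taken some seconds apart. A sketch; the epoch/round field names here are an assumption about the response shape, so check them against your node's actual output before relying on this:

```python
def is_advancing(before, after):
    """Given two current_epoch_round samples from /core/status/network-status
    (earlier sample first), check that the ledger is advancing: either the
    epoch increased, or the round increased within the same epoch."""
    if after["epoch"] != before["epoch"]:
        return after["epoch"] > before["epoch"]
    return after["round"] > before["round"]
```

Poll the endpoint twice with a short sleep in between and alert when `is_advancing` returns False for several consecutive pairs, since a single stalled sample can be a fluke.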
We highly encourage you to follow the remainder of this tutorial to productionize your node.

Troubleshooting

If your node isn’t running at this point, you can drop a message on Discord (https://discord.com/invite/radixdlt) where Radix staff and community will be happy to help out.

Creating a systemd service

Hardly anyone would be happy having to manually run a script every time the system reboots. To avoid that, in this section we’ll create a systemd service that will manage our node process. We won’t be needing the run.sh script from the previous step anymore, but you’re free to keep it for debugging purposes. Just make sure that you’re not trying to run it at the same time as the systemd service is active.

We’ll reuse the environment file from the previous section, so go ahead and create it if you haven’t yet. Keep in mind that it’s a simple key-value file, not a bash script (so e.g. variable substitution like ABC="$CDE" won’t work!).

We’ll create a system-wide service for our node, but with minimal adjustment it could also run as a user service, if that’s what you prefer. In any case, we’ll still run the process under the radixdlt user.

Create a systemd unit file (e.g. /etc/systemd/system/babylon-node.service):

[Unit]
Description=Radix Babylon Node
After=local-fs.target
After=network-online.target
After=nss-lookup.target
After=time-sync.target
After=systemd-journald-dev-log.socket
Wants=network-online.target

[Service]
EnvironmentFile=/srv/radixdlt/babylon-mainnet/environment
User=radixdlt
LimitNOFILE=65536
LimitNPROC=65536
LimitMEMLOCK=infinity
WorkingDirectory=/opt/radixdlt/babylon-node/core-v1.2.1
ExecStart=/opt/radixdlt/babylon-node/core-v1.2.1/bin/core -config=/srv/radixdlt/babylon-mainnet/mainnet.config
SuccessExitStatus=143
TimeoutStopSec=10
Restart=on-failure

[Install]
WantedBy=multi-user.target

Don’t forget to update the paths if yours differ from the example! 
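The "key-value file, not a bash script" caveat can be checked for automatically before starting the service. A rough sketch, not a full systemd EnvironmentFile parser:

```python
import re

def lint_environment_file(text):
    """Flag lines in a systemd EnvironmentFile that rely on shell features
    systemd does not support, such as $VAR substitution in values.
    Returns a list of (line_number, problem) tuples."""
    problems = []
    for number, line in enumerate(text.splitlines(), start=1):
        stripped = line.strip()
        if not stripped or stripped.startswith("#"):
            continue  # blank lines and comments are fine
        if "=" not in stripped:
            problems.append((number, "not a KEY=VALUE assignment"))
        elif re.search(r"\$\{?\w+", stripped.split("=", 1)[1]):
            problems.append((number, "shell-style substitution will not be expanded"))
    return problems
```

Running it over the environment file from the previous section should return an empty list; a line like `ABC="$CDE"` would be flagged.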
Verifying that the systemd service works Run the following command to start the service: sudo systemctl start babylon-node You can use journalctl to inspect the logs of a running node (or errors, if there are any): sudo journalctl -f -u babylon-node To stop the node use: sudo systemctl stop babylon-node Enabling systemd service autostart Run the following command to enable the node service at startup: sudo systemctl enable babylon-node.service Configuring a reverse-proxy (nginx) for node APIs nginx is a modern proxy server that we can use to securely expose our node’s APIs on a public network interface. Note that this is an optional step. You can skip it if you don’t need your node’s APIs to be accessible on the internet (you can still use e.g. ssh and query the API locally). Installing nginx To install nginx on Ubuntu run: sudo apt install -y nginx apache2-utils nginx comes with some predefined site directories that you’re not going to need, so you can delete them: sudo rm -rf /etc/nginx/{sites-available,sites-enabled} Download Radix Node nginx configuration files We provide a set of ready-to-use configuration files for your nginx server, which you can download from https://github.com/radixdlt/babylon-nginx/releases (https://github.com/radixdlt/babylon-nginx/releases) . - Go to https://github.com/radixdlt/babylon-nginx/releases (https://github.com/radixdlt/babylon-nginx/releases) and look for the entry with the Latest release marker. - Download the babylon-nginx-fullnode-conf.zip file (e.g. copy its URL and use wget on your server). - Unzip the nginx configuration you’ve just downloaded: unzip babylon-nginx-fullnode-conf.zip - Copy the files to the nginx setup directory: sudo cp -r conf.d/ /etc/nginx/ - And now copy the nginx configuration files: sudo cp nginx-fullnode.conf /etc/nginx/nginx.conf Create a cache directory nginx requires a cache directory for storing reusable artifacts.
Use the following command to create the cache: sudo mkdir -p /var/cache/nginx/radixdlt-hot Create the SSL Certificates You can use your own SSL certificates if you wish, but for convenience, you’ll find the instructions for creating a set here. TIP If you run into trouble generating the SSL key due to lack of entropy, try running: sudo apt install rng-tools sudo rngd -r /dev/random - Create the directory to hold the certificates: sudo mkdir /etc/nginx/secrets - Create the SSL keys using the following command: sudo openssl req -nodes -new -x509 -subj '/CN=localhost' -keyout "/etc/nginx/secrets/server.key" -out "/etc/nginx/secrets/server.pem" - And now execute this command to generate the Diffie-Hellman parameters that nginx needs: sudo openssl dhparam -out /etc/nginx/secrets/dhparam.pem 4096 This command may take a minute or more to run. - Run the next command to set the authentication password for the server’s admin user: sudo htpasswd -c /etc/nginx/secrets/htpasswd.admin admin - Similarly you can set the authentication password for the server’s metrics user: sudo htpasswd -c /etc/nginx/secrets/htpasswd.metrics metrics Open the firewall port sudo ufw allow 443/tcp Start nginx - Now, to start nginx, execute the following command: sudo systemctl start nginx - And now run this command to make sure that nginx starts up when the host server restarts: sudo systemctl enable nginx - You can check if the service is running by executing this command: curl -k -u admin:{nginx-admin-password} 'https://localhost/system/identity' --header 'Content-Type: text/plain' which should return a JSON response with node details, including its address: { "node_address": "node_...", ... } If you’re getting connection errors when trying to connect to the node, then you may need to restart both the node and nginx so they sync correctly.
Try executing the following commands: sudo systemctl restart babylon-node sudo systemctl restart nginx ## Manual Setup with Docker URL: https://radix.wiki/developers/legacy-docs/run/node/node-setup/manual-setup-advanced/node-setup-docker Updated: 2026-02-18 Summary: Welcome to the Radix node manual setup guide! Run > Node > Installation and Basic Setup > Manual Setup (Advanced) > Manual Setup with Docker — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/manual-setup-advanced/node-setup-docker.md) Introduction Welcome to the Radix node manual setup guide! Before we begin, make sure you’ve read the Node Setup Introduction (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/README.md) . On this page we’ll guide you through the process of setting up your node to run inside a Docker container directly on the host system, and we’ll do it manually (i.e. without the use of our helper babylonnode CLI). Using the CLI is an alternative and easier way (for dedicated server instances). Manual mode is however better suited for shared servers (i.e. when the server isn’t solely dedicated to running a Radix node) or non-Ubuntu hosts. If you wish to switch to a CLI path, go to Guided Setup Instructions (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/node-setup-guided/README.md) . There’s also an alternative path that utilizes systemd, instead of Docker here (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/manual-setup-advanced/node-setup-systemd/README.md) (also manually, i.e. without using our CLI). You can use any compatible operating system (again, check Node Setup Introduction (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/README.md) for details), however all examples in this guide will be for Ubuntu 22.04. Consult your system’s manual for the corresponding packages/commands. 
We’ve split the installation into discrete stages: - Installing Docker and the secure key generation tools. - Using the keygen application to generate secure keys for your service. - Installing and running the node on a provisioned server instance. Note We’re using Amazon Web Services (https://aws.amazon.com/) throughout our example, but you can install Radix nodes on any cloud service that supports Java, such as Google’s cloud platform (https://cloud.google.com/) or Microsoft Azure (https://azure.microsoft.com/) – or follow similar steps to deploy on a private server. Installing Docker Run the following commands to install docker and docker-compose. - Create the directory for the compose scripts: sudo apt update mkdir radixdlt cd radixdlt mkdir babylon-ledger All further instructions assume you are running the commands from the radixdlt directory and use radixdlt/babylon-ledger as your storage directory. Make sure to update the docker-compose file if these should be changed. - Install wget, Docker and the tools for generating randomized keys: sudo apt install wget docker.io docker-compose rng-tools # Setup the random number generator sudo rngd -r /dev/random - Add your user to the Docker group: sudo groupadd docker # ignore any errors if the group already exists sudo usermod -aG docker $USER newgrp docker Get the Docker Compose Script You will need a docker compose script to build the node, which you can download from the Radix repository on GitHub (https://github.com/radixdlt/babylon-nodecli/blob/main/node-runner-cli/templates/radix-fullnode-compose.yml.j2) directly to your server. Here we provide an example Docker compose file for Mainnet/Stokenet nodes. Copy and paste the following code into a file and call it radix-fullnode-compose.yml. The list of all possible configuration options is available here (https://github.com/radixdlt/babylon-node/blob/main/docker/config/default.config.envsubst) .
Mainnet node

version: '3.8'
services:
  core:
    cap_add:
      - NET_ADMIN
    environment:
      RADIXDLT_NETWORK_ID: 1
      RADIXDLT_NETWORK_SEEDS_REMOTE: "radix://node_rdx1qf2x63qx4jdaxj83kkw2yytehvvmu6r2xll5gcp6c9rancmrfsgfw0vnc65@52.212.35.209,radix://node_rdx1qgxn3eeldj33kd98ha6wkjgk4k77z6xm0dv7mwnrkefknjcqsvhuu4gc609@54.79.136.139,radix://node_rdx1qwrrnhzfu99fg3yqgk3ut9vev2pdssv7hxhff80msjmmcj968487uugc0t2@43.204.226.50,radix://node_rdx1q0gnmwv0fmcp7ecq0znff7yzrt7ggwrp47sa9pssgyvrnl75tvxmvj78u7t@52.21.106.232"
      JAVA_OPTS: --enable-preview -server -Xms4g -Xmx12g -XX:MaxDirectMemorySize=2048m -XX:+HeapDumpOnOutOfMemoryError -XX:+UseCompressedOops -Djavax.net.ssl.trustStore=/etc/ssl/certs/java/cacerts -Djavax.net.ssl.trustStoreType=jks -Djava.security.egd=file:/dev/urandom -DLog4jContextSelector=org.apache.logging.log4j.core.async.AsyncLoggerContextSelector
      RADIXDLT_LOG_LEVEL: info
      RADIXDLT_NETWORK_USE_PROXY_PROTOCOL: 'false'
      RADIXDLT_VALIDATOR_KEY_LOCATION: /home/radixdlt/node-keystore.ks
      RADIX_NODE_KEYSTORE_PASSWORD: "${RADIXDLT_NODE_KEY_PASSWORD}"
    image: radixdlt/babylon-node:v1.2.1
    init: true
    mem_limit: 14000m
    restart: unless-stopped
    ulimits:
      memlock: -1
      nofile:
        hard: 65536
        soft: 65536
    volumes:
      - babylon_ledger:/home/radixdlt/RADIXDB
      - ./node-keystore.ks:/home/radixdlt/node-keystore.ks
  nginx:
    environment:
      RADIXDLT_GATEWAY_API_ENABLE: 'true'
      RADIXDLT_GATEWAY_BEHIND_AUTH: 'true'
      RADIXDLT_NETWORK_USE_PROXY_PROTOCOL: 'false'
      RADIXDLT_TRANSACTIONS_API_ENABLE: 'false'
    image: radixdlt/babylon-nginx:1.0.8
    ports:
      - 443:443
      - 30000:30000
    restart: unless-stopped
    ulimits:
      nofile:
        hard: 65536
        soft: 65536
    volumes:
      - nginx_secrets:/etc/nginx/secrets
volumes:
  babylon_ledger:
    driver: local
    driver_opts:
      device: ./babylon-ledger
      o: bind
      type: none
  nginx_secrets:

Stokenet (testnet) node

version: '3.8'
services:
  core:
    cap_add:
      - NET_ADMIN
    environment:
      RADIXDLT_NETWORK_ID: 2
      RADIXDLT_NETWORK_SEEDS_REMOTE: "radix://node_tdx_2_1qv89yg0la2jt429vqp8sxtpg95hj637gards67gpgqy2vuvwe4s5ss0va2y@13.126.248.88,radix://node_tdx_2_1qvtd9ffdhxyg7meqggr2ezsdfgjre5aqs6jwk5amdhjg86xhurgn5c79t9t@13.210.209.103,radix://node_tdx_2_1qwfh2nn0zx8cut5fqfz6n7pau2f7vdyl89mypldnn4fwlhaeg2tvunp8s8h@54.229.126.97,radix://node_tdx_2_1qwz237kqdpct5l3yjhmna66uxja2ymrf3x6hh528ng3gtvnwndtn5rsrad4@3.210.187.161"
      JAVA_OPTS: --enable-preview -server -Xms4g -Xmx12g -XX:MaxDirectMemorySize=2048m -XX:+HeapDumpOnOutOfMemoryError -XX:+UseCompressedOops -Djavax.net.ssl.trustStore=/etc/ssl/certs/java/cacerts -Djavax.net.ssl.trustStoreType=jks -Djava.security.egd=file:/dev/urandom -DLog4jContextSelector=org.apache.logging.log4j.core.async.AsyncLoggerContextSelector
      RADIXDLT_LOG_LEVEL: info
      RADIXDLT_NETWORK_USE_PROXY_PROTOCOL: 'false'
      RADIXDLT_VALIDATOR_KEY_LOCATION: /home/radixdlt/node-keystore.ks
      RADIX_NODE_KEYSTORE_PASSWORD: "${RADIXDLT_NODE_KEY_PASSWORD}"
    image: radixdlt/babylon-node:v1.2.1
    init: true
    mem_limit: 14000m
    restart: unless-stopped
    ulimits:
      memlock: -1
      nofile:
        hard: 65536
        soft: 65536
    volumes:
      - babylon_ledger:/home/radixdlt/RADIXDB
      - ./node-keystore.ks:/home/radixdlt/node-keystore.ks
  nginx:
    environment:
      RADIXDLT_GATEWAY_API_ENABLE: 'true'
      RADIXDLT_GATEWAY_BEHIND_AUTH: 'true'
      RADIXDLT_NETWORK_USE_PROXY_PROTOCOL: 'false'
      RADIXDLT_TRANSACTIONS_API_ENABLE: 'false'
    image: radixdlt/babylon-nginx:1.0.8
    ports:
      - 443:443
      - 30000:30000
    restart: unless-stopped
    ulimits:
      nofile:
        hard: 65536
        soft: 65536
    volumes:
      - nginx_secrets:/etc/nginx/secrets
volumes:
  babylon_ledger:
    driver: local
    driver_opts:
      device: ./babylon-ledger
      o: bind
      type: none
  nginx_secrets:

Generate the keys You’ll need the Radix Key Generator application to create secure keys for the node once it’s installed. (It’s a good idea to generate the keys first, just to get them out of the way). The keystore contains a randomly-generated private key that determines your node’s unique address.
This means that if you lose your keystore file, you will forever lose your node address and you’ll need to generate a new key for your node. You can check for the latest version of the keygen program here: hub.docker.com/radixdlt/keygen (https://hub.docker.com/r/radixdlt/keygen/tags?page=1&ordering=last_updated) which should be used in the command given below to generate the secure keys. Change the --password= parameter in the following command to a secure password of your choice! Don’t forget it! docker run --rm -v ${PWD}:/keygen/key radixdlt/keygen:v1.4.1 --keystore=/keygen/key/node-keystore.ks --password=your-password If you check the directory, you should now have a key file called node-keystore.ks. The key generation process may take a long time if your server hasn’t generated a sufficiently large pool of random values to build fresh keys. Be prepared to wait up to twenty minutes for the key generation to complete. You must change the key file’s permissions so that the container can use it. sudo chmod 644 node-keystore.ks And update the password in the radix-fullnode-compose.yml : RADIX_NODE_KEYSTORE_PASSWORD: "your-password" Configure the Ports The node requires that a number of ports are accessible on your server instance. Ensure that ports 443 and 30000 are available and can be seen externally. sudo ufw allow 30000/tcp sudo ufw allow 443/tcp Bear in mind that you must arrange for port access outside your cloud server instance: this is usually done through the management console provided by your cloud service. Configure nginx admin password The node uses nginx (https://www.nginx.com/) as its front-end server. During startup, it creates an HTTP basic auth user named admin. The password is generated automatically and printed to the logs. If you want to use your own password, you will need to use the Docker instruction below to set the password before running the node installation. Replace the password placeholder with your own password.
docker run --rm -v radixdlt_nginx_secrets:/secrets radixdlt/htpasswd:v1.1.0 htpasswd -bc /secrets/htpasswd.admin admin If you omit this command, nginx will create a password and print it to the log on first startup; it will not be printed again if the nginx container later restarts. To set up the metrics user, run the following command, replacing the password placeholder with your own: docker run --rm -v radixdlt_nginx_secrets:/secrets radixdlt/htpasswd:v1.1.0 htpasswd -bc /secrets/htpasswd.metrics metrics Running the node Run the compose script to install and run the node. docker-compose -f radix-fullnode-compose.yml up -d The docker-compose file uses relative paths and expects this command to be run from the radixdlt folder created in the first step. Verifying that the node works You can use docker logs -f radixdlt_core_1 to inspect the logs of a running node. A fresh node should start with processing the genesis transactions. You should be seeing messages that begin with “Committing data ingestion chunk…” in the output log. This may take up to 30 minutes to fully process, depending on your hardware. Usually it finishes in around 10-15 minutes. Warning Don’t stop (or restart) the Docker container while the genesis transactions are being committed! If you do so before the whole process is completed you’ll need to wipe the data directory (babylon-ledger) and start from scratch. After the process completes you should start seeing messages like “lgr_commit{epoch= … }”, which indicate that the node is working correctly (at which point you can safely stop/restart the container any time you wish). At this point you can also query the System API and the Core API. The System API’s health endpoint provides an overall health summary of a node: curl -k -u "admin:" https://127.0.0.1/system/health The response should include "status":"SYNCING".
All possible statuses have been described in detail here (https://github.com/radixdlt/babylon-node/blob/main/docs/logging-and-monitoring/health-metrics.md) . Tip! jq is a great tool for querying the endpoints that return JSON data (such as node’s System and Core APIs). On Ubuntu you can install it with: sudo apt install jq We’ll use it from now on in the examples. But if you prefer not to install it, simply skip the | jq pipe suffix from the example commands. You can also inspect the Core API and query the current state of the ledger: curl -k -u "admin:" https://127.0.0.1/core/status/network-status -X POST -H 'Content-Type: application/json' --data '{"network": "mainnet"}' | jq There’s a current_epoch_round field in the response and both the epoch and round number should be steadily growing as you re-run the query - this indicates that the node is correctly syncing the ledger state. Congratulations! At this point you’ve got a fully operational Radix node connected to the network. Note that it might take some time until it’s fully synced up with the latest ledger state. You can check the current epoch using one of the community-run Radix network dashboards on the internet. Troubleshooting If your node isn’t running at this point you can drop a message on Discord (https://discord.com/invite/radixdlt) where Radix staff and community will be happy to help out. 
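The "steadily growing" check on current_epoch_round can also be scripted. A sketch using two hardcoded sample responses in place of two live queries run a few seconds apart (the current_epoch_round shape follows the description above; the numbers are invented), assuming jq is installed:

```shell
# The node is syncing if (epoch, round) advanced between two queries.
# r1/r2 stand in for two successive network-status responses.
r1='{"current_epoch_round":{"epoch":100,"round":50}}'
r2='{"current_epoch_round":{"epoch":100,"round":75}}'
# Collapse (epoch, round) into one comparable number; epoch dominates.
key() { echo "$1" | jq '.current_epoch_round | .epoch * 1000000 + .round'; }
if [ "$(key "$r2")" -gt "$(key "$r1")" ]; then
  echo "syncing"
else
  echo "stalled"
fi
```

Against a live node, each sample would be replaced by the authenticated network-status curl call shown above.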
## Manual Setup (Advanced) URL: https://radix.wiki/developers/legacy-docs/run/node/node-setup/manual-setup-advanced/manual-setup-advanced Updated: 2026-02-18 Summary: Legacy documentation: Manual Setup (Advanced) Run > Node > Installation and Basic Setup > Manual Setup (Advanced) — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/manual-setup-advanced/README.md) ## Updating the node URL: https://radix.wiki/developers/legacy-docs/run/node/node-setup/node-setup-guided/node-setup-guided-updating-node Updated: 2026-02-18 Summary: When updating your node, care should be taken to ensure the update goes smoothly. These instructions use the babylonnode CLI tool for an easy update process. Run > Node > Installation and Basic Setup > Guided Setup (Recommended) > Updating the node — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/node-setup-guided/node-setup-guided-updating-node.md) Introduction When updating your node, care should be taken to ensure the update goes smoothly. These instructions use the babylonnode CLI tool for an easy update process. 1. Prepare to update First, if you haven’t already done so, ensure that you have backed up your node-keystore.ks key file. This key file contains the private key that determines your node’s unique address. If anything goes wrong in the update process, if you have your key file, you can always reinstall the node from scratch and use it to recover access to your node. Next, you may want to consider using a backup node to perform a switch to the updated node with minimal interruption (especially if running a validator node) – or to provide a quick recovery if something goes wrong during the update. 2. 
Update your node Docker mode babylonnode docker install --update systemd mode babylonnode systemd install -u - The -u option specifies that this launch of the node will be an update, causing babylonnode to create a backup of the current configuration file and ensure that the node has stopped before applying the changes. - Optional - The -r option specifies the release of the node software you wish to install. If not provided, it will use the latest release from https://github.com/radixdlt/babylon-node/releases (https://github.com/radixdlt/babylon-node/releases) - Optional - The -i option is the external IP address of your server. More information on using babylonnode You can use babylonnode to carry out a variety of administrative tasks. For a full reference on the commands the babylonnode CLI offers, please see the documentation on the CLI GitHub repository (https://github.com/radixdlt/babylon-nodecli/blob/main/docs/command_reference.adoc) . If you have any questions or run into problems, you can get support from the Radix team members and our fantastic community on our Discord server. ## Installing the node URL: https://radix.wiki/developers/legacy-docs/run/node/node-setup/node-setup-guided/node-setup-guided-installing-node Updated: 2026-02-18 Summary: Using the babylonnode CLI tool, you can install dependencies, configure your node installation, and then install and start up your node. Run > Node > Installation and Basic Setup > Guided Setup (Recommended) > Installing the node — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/node-setup-guided/node-setup-guided-installing-node.md) Introduction Using the babylonnode CLI tool, you can install dependencies, configure your node installation, and then install and start up your node. The CLI supports running the node either using Docker or via systemd. However, we recommend Docker as the more straightforward and easy-to-maintain method.
Each step specifies the commands for each mode - you should only run the commands for the mode of your choice (e.g. only those for the “Docker mode”), not both. 1. Install the babylonnode CLI If you haven’t done so already, install the babylonnode CLI (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/node-setup-guided/node-setup-guided-installing-cli.md) . 2. Install the dependencies on the host machine This only needs to be run during first-time setup on a new host machine. This is done through one simple call of the babylonnode CLI script. Docker mode babylonnode docker dependencies Once the process has completed, you will be asked to log out of your ssh session and log back in. systemd mode babylonnode systemd dependencies The script will install the software necessary for systemd mode. You’ll be asked to create a password for the radixdlt user (that’s the default user that the node process runs under). Once you have entered and confirmed the password, you will be presented with a series of instructions to enable the radixdlt user to execute commands without the need for user intervention. Execute the following commands so that the radixdlt user can use sudo commands without needing you to enter a password: $ sudo su $ echo "radixdlt ALL=(ALL) NOPASSWD:ALL" > /etc/sudoers.d/radixdlt $ exit 3. Install the node 3.1. Create the config file The CLI uses a two-phase config / install setup. First, you are led through an interactive prompt to create a config.yaml file, which by default is stored at ~/babylon-node-config/config.yaml. This config file serves as a base for the actual node configuration, which is generated by babylonnode. It is also used for future upgrades, to avoid having to remember your settings later. Record the password safely A keystore file is created during the config command, and the password that was entered is written to the generated config.yaml file.
If you wish, you can delete the line containing keystore_password, but ensure you store the keystore password value safely, as it will be required in a later step. Back up your keystore The keystore file contains a randomly-generated private key that determines your node’s identity for peer-to-peer communications, and can allow it to act as a validator, if a validator entity’s public key is set to your node’s. We recommend that you securely back up your key file as soon as you’ve generated it, and carefully protect it. Docker mode The config command has two modes for node setup. If you are running a node for the first time, we recommend using the CORE mode. If you are already running an existing node using docker, you will need to use the DETAILED mode to migrate the setup, as it allows you to create a config file based on the answers from your existing setup. CORE - This is a quick config mode that creates a config file with simple defaults to run a non-validator full node. babylonnode docker config -m CORE DETAILED - This is a detailed config mode, where the user is asked all the necessary questions, with follow-ups based on previous answers. babylonnode docker config -m DETAILED The config file is created from the input. It will be used during the installation command. By default you can find it at ~/babylon-node-config/config.yaml. systemd mode First, log in as radixdlt sudo su - radixdlt cd /home/radixdlt CORE - This is a quick config mode that creates a config file with simple defaults to run a non-validator full node. babylonnode systemd config -m CORE The config file is created from the input. It will be used during the installation command. By default you can find it at ~/babylon-node-config/config.yaml.
Mainnet Seed Nodes radix://node_rdx1qf2x63qx4jdaxj83kkw2yytehvvmu6r2xll5gcp6c9rancmrfsgfw0vnc65@babylon-mainnet-eu-west-1-node0.radixdlt.com,radix://node_rdx1qgxn3eeldj33kd98ha6wkjgk4k77z6xm0dv7mwnrkefknjcqsvhuu4gc609@babylon-mainnet-ap-southeast-2-node0.radixdlt.com,radix://node_rdx1qwrrnhzfu99fg3yqgk3ut9vev2pdssv7hxhff80msjmmcj968487uugc0t2@babylon-mainnet-ap-south-1-node0.radixdlt.com,radix://node_rdx1q0gnmwv0fmcp7ecq0znff7yzrt7ggwrp47sa9pssgyvrnl75tvxmvj78u7t@babylon-mainnet-us-east-1-node0.radixdlt.com Stokenet Seed Nodes radix://node_tdx_2_1qv89yg0la2jt429vqp8sxtpg95hj637gards67gpgqy2vuvwe4s5ss0va2y@babylon-stokenet-ap-south-1-node0.radixdlt.com,radix://node_tdx_2_1qvtd9ffdhxyg7meqggr2ezsdfgjre5aqs6jwk5amdhjg86xhurgn5c79t9t@babylon-stokenet-ap-southeast-2-node0.radixdlt.com,radix://node_tdx_2_1qwfh2nn0zx8cut5fqfz6n7pau2f7vdyl89mypldnn4fwlhaeg2tvunp8s8h@babylon-stokenet-eu-west-1-node0.radixdlt.com,radix://node_tdx_2_1qwz237kqdpct5l3yjhmna66uxja2ymrf3x6hh528ng3gtvnwndtn5rsrad4@babylon-stokenet-us-east-1-node1.radixdlt.com 3.2. Install and set up the node Run the following command and pay close attention to the parameters: Docker mode babylonnode docker install - Optional - Use parameter -f if the config file is different from /home/ubuntu/babylon-node-config/config.yaml. - Optional - Use parameter -a or --autoapprove to run this command without any prompts. It is only recommended for automation purposes. - Use parameter -u or --update to deploy the latest versions of the software. systemd mode - Log in as radixdlt sudo su - radixdlt - Then ensure that you are in the /home/radixdlt directory: cd /home/radixdlt - Run the following command, paying close attention to the parameters babylonnode systemd install - Optional - The -i option is the external IP address of your server. - Optional - The -u flag is used only if you are upgrading from a previous version of the node.
If applied to the command, it will create a backup of the old configuration file and ensure that the node has stopped before applying the changes. - Optional - The -t setting is the address of a Radix node you can use to join the network (a seed node). Select one from the list of seed nodes. 4. Set passwords for the nginx server By default there are two separate users configured for accessing the APIs: - admin: for accessing the Core API (/core/* endpoints) and the System API (/system/* endpoints) - metrics: for accessing /prometheus/metrics - Execute the following commands to set the passwords for the nginx users: Docker mode: - admin: babylonnode auth set-admin-password --setupmode DOCKER - metrics: babylonnode auth set-metrics-password --setupmode DOCKER systemd mode: - admin: babylonnode auth set-admin-password --setupmode SYSTEMD - metrics: babylonnode auth set-metrics-password --setupmode SYSTEMD - Enter your password at the prompt - As shown on screen, enter the following commands to set an environment variable for your password (remember to fill in your own password): echo 'export NGINX_ADMIN_PASSWORD="nginx-password"' >> ~/.bashrc echo 'export NGINX_METRICS_PASSWORD="nginx-password"' >> ~/.bashrc - Add the new environment variable to your session by executing the following command: source ~/.bashrc 5. Verifying that the node works After you’ve installed the node using the commands above, it should start automatically. It should start processing the genesis transactions. Depending on your setup mode you can use either docker logs -f ubuntu-core-1 or journalctl -f -u radixdlt-node to view the logs. You should be seeing messages that begin with “Committing data ingestion chunk…” in the output log. This may take up to 30 minutes to fully process, depending on your hardware. Usually it finishes in around 10-15 minutes. Warning Don’t terminate (or restart) the node while the genesis transactions are being committed!
If you do so before the whole process is completed you’ll need to wipe the data directory and start from scratch. After the process completes you should start seeing messages like “lgr_commit{epoch= … }”, which indicate that the node is working correctly (at which point you can safely stop/restart it any time you wish). At this point you can also query the System API and the Core API. The System API’s health endpoint provides an overall health summary of a node: babylonnode api system health The response should include "status":"SYNCING". All possible statuses are described in detail here (https://github.com/radixdlt/babylon-node/blob/main/docs/logging-and-monitoring/health-metrics.md) . You can also inspect the Core API and query the current state of the ledger: curl -k -u "admin:nginx-password" https://127.0.0.1/core/status/network-status -X POST -H 'Content-Type: application/json' --data '{"network": "mainnet"}' | jq There’s a current_epoch_round field in the response, and both the epoch and round numbers should be steadily growing as you re-run the query - this indicates that the node is correctly syncing the ledger state. Congratulations! At this point you’ve got a fully operational Radix node connected to the network. Note that it might take some time until it’s fully synced up with the latest ledger state. You can check the current epoch using one of the community-run Radix network dashboards on the internet. 7. Stopping your node This command will shut down a systemd or docker node, and is executed differently depending on which kind of installation you’re running. Docker mode babylonnode docker stop Optional parameter -f can be used to point to the config.yaml file systemd mode babylonnode systemd stop Optional parameter -f can be used to point to the config.yaml file 8.
Restarting your node Restart the node using the following command: Docker mode babylonnode docker restart Optional parameter -f can be used to point to the config.yaml file systemd mode babylonnode systemd restart Optional parameter -f can be used to point to the config.yaml file 10. Advanced user config For advanced users, there is an additional file that can optionally be mounted to provide additional configuration to the babylon-node application, which will be appended to the configuration when executing the install command. This allows you to add configuration that the babylonnode CLI does not natively support. This file is located by default at ~/babylon-node-config/advanced-user.env-file for Docker and at ~/babylon-node-config/advanceduserconfig for systemd. The file will be picked up when executing install without needing to rerun the config command. Advanced user config - Docker mode Additional environment variables can be added by creating the file ~/babylon-node-config/advanced-user.env-file and writing environment variables in the format described here (https://docs.docker.com/compose/environment-variables/env-file/) . A file from a different location can be used during the install command by providing the option -aue or --advanceduserenvs followed by the desired location. Refer to the default.config substitution file (https://github.com/radixdlt/babylon-node/blob/main/docker/config/default.config.envsubst) to find out which variables will be converted to which config parameters. Please note that the file will only be mounted when executing install while the file is present at ~/babylon-node-config/advanced-user.env-file or the provided custom path. Advanced user config - systemd mode Additional configurations can be provided by creating a file at ~/babylon-node-config/advanceduserconfig and adding plain configuration to it. This configuration will be appended to the default.config file during templating when executing the install command.
A file from a different location can be used during the install command by providing the option -auc or --advanceduserconfig followed by the desired location. Refer to the default.config substitution file (https://github.com/radixdlt/babylon-node/blob/main/docker/config/default.config.envsubst) to find out the format and supported variables. Please note that the file and changes to it will not be added when running start, restart or stop, but only when executing the install command while the file at ~/babylon-node-config/advanceduserconfig or the provided custom location is present. More information on using babylonnode You can use babylonnode to carry out a variety of administrative tasks. For a full reference on the commands the babylonnode CLI offers, please see the documentation on the CLI GitHub repository (https://github.com/radixdlt/babylon-nodecli/blob/main/docs/command_reference.adoc) . If you have any questions or run into problems, you can get support from the Radix team members and our fantastic community on our Discord server. ## Installing the babylonnode CLI URL: https://radix.wiki/developers/legacy-docs/run/node/node-setup/node-setup-guided/node-setup-guided-installing-cli Updated: 2026-02-18 Summary: Start by ensuring that the package list on the system is up to date by running the following command from the terminal: Run > Node > Installation and Basic Setup > Guided Setup (Recommended) > Installing the babylonnode CLI — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/node-setup-guided/node-setup-guided-installing-cli.md) 1. Update the packages Start by ensuring that the package list on the system is up to date by running the following command from the terminal: sudo apt update You’ll also need wget and curl, if they do not come preinstalled on your distribution: sudo apt install wget curl 2. Download and install the babylonnode CLI Now download the babylonnode CLI from its GitHub repository.
- Go to babylon-nodecli/releases/latest (https://github.com/radixdlt/babylon-nodecli/releases/latest) - Copy the link to the binary for your distribution (e.g. babylonnode-ubuntu-22.04) - On the target system, download the binary using wget. Here’s an example command that downloads version 2.2.3 (but you should always use the latest one) for Ubuntu 22.04: wget -O babylonnode https://github.com/radixdlt/babylon-nodecli/releases/download/2.2.3/babylonnode-ubuntu-22.04 By using the babylonnode CLI you agree to the End User License Agreement (https://uploads-ssl.webflow.com/6053f7fca5bf627283b582c2/65006e3d3a8002f9e834320c_radixdlt.com_genericEULA.pdf) . Set the permissions on the binary to executable: chmod +x babylonnode And move it to the /usr/local/bin/ directory, so that it’s accessible without the need to specify a full path: sudo mv babylonnode /usr/local/bin If you are installing the babylonnode CLI for use with an existing systemd node installation, then babylonnode should instead be moved to your radixdlt directory where it will expect to find your existing keystore file. You may also need to set the current user as the owner of the radixdlt directory. Next: Installing the Node (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/node-setup-guided/node-setup-guided-installing-node.md) ## Guided Setup (Recommended) URL: https://radix.wiki/developers/legacy-docs/run/node/node-setup/node-setup-guided/node-setup-guided Updated: 2026-02-18 Summary: Welcome to the Radix node setup guide using the babylonnode CLI! Run > Node > Installation and Basic Setup > Guided Setup (Recommended) — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/node-setup-guided/README.md) Welcome to the Radix node setup guide using the babylonnode CLI! In this mode, you’ll use our Python CLI tool (babylonnode) for a straightforward, guided installation of Radix nodes on an Ubuntu host. We recommend this method for most people.
This process works best on a fresh, dedicated machine (e.g. cloud instance). babylonnode is a versatile tool that can not only assist in setting up a node and related tools, but also help manage them afterwards. This script has been tested on Ubuntu 22.04.2.0 LTS (Jammy Jellyfish) (https://releases.ubuntu.com/22.04/) , and requires Python version 3 to run (if you don’t have an up-to-date version installed, follow the guide here (https://phoenixnap.com/kb/how-to-install-python-3-ubuntu) ). The CLI does not support Windows at the moment. Before you begin, please read the Node Setup Introduction (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/README.md) , if you haven’t done so yet. Next: Installing the babylonnode CLI (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/node-setup-guided/node-setup-guided-installing-cli.md) ## Installation and Basic Setup URL: https://radix.wiki/developers/legacy-docs/run/node/node-setup/node-setup Updated: 2026-02-18 Summary: Welcome to the Radix node installation guide! Before we begin, let’s take a look at a high-level overview of the whole process. Please read Node Introduction(.. Run > Node > Installation and Basic Setup — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/README.md) Introduction Welcome to the Radix node installation guide! Before we begin, let’s take a look at a high-level overview of the whole process. Please read Node Introduction (https://github.com/gguuttss/radix-docs/blob/master/run/node/README.md) first, if you haven’t done so yet. Process overview A complete node setup can be broken down into the following steps: - Installing the node software, running it and connecting to the network This is the heart of the process that gets you a fully functional Radix node. - Strengthening your node’s security This includes, among other things, setting up an nginx proxy for your node’s APIs.
- Learning how to monitor your node’s health and perform simple administrative tasks - (Optional) Advanced monitoring, optimization and configuration A node can be configured in a number of ways to suit different purposes (e.g. it might enable some additional indices). We discuss some of the possibilities later in this guide. - (Optional) Running additional software A few additional applications can be installed and connected to a running node, e.g.: - an advanced indexer with rich query capabilities: Network Gateway (https://github.com/radixdlt/babylon-gateway) - a Grafana dashboard - (Optional) Registering as a Validator Only for node runners who wish to become validators. Make sure you have a good understanding of what a validator node is (https://github.com/gguuttss/radix-docs/blob/master/run/node/README.md) before engaging in this step. In the first part of this guide we’ll focus on the top three items from the above list as they’re typically done together. Node components Before we begin, let’s also take a look at the key components of a node. Each node consists of: - The primary software package: a zip file containing the Java application and all its dependencies (that’s the babylon-node-va.b.c.zip file on our Releases Page (https://github.com/radixdlt/babylon-node/releases) ) - An additional native library - a piece of compiled software written in Rust (babylon-node-rust-arch--release-va.b.c.zip file on our Releases Page (https://github.com/radixdlt/babylon-node/releases) ).
Unlike the Java application, a different version of the native library is needed depending on the target system architecture - A file containing the node’s cryptographic key used to uniquely identify the node on the network (keystore file) - A configuration file (depending on the setup mode, this could either be a Java properties file or a set of environment variables) - A directory where the node can store its files (which include a complete ledger, any additional indices, etc) Depending on the setup mode of your choice (described below) you’ll need to install/configure some of these things manually, while others will be taken care of for you. Radix networks There are two public Radix networks available that your node can connect to: the main/real network (Mainnet) and a public test network (Stokenet). Once you connect to one of them, your node can’t switch to the other, unless you reconfigure the node and point it to a fresh data directory. As you follow the tutorial, at some point you’ll be presented with a choice as to which network to set your node up for. Radix Foundation controls the Stokenet (testnet) stake distribution and which nodes are in the Active Validator Set. If you wish to set up a validator node and test its operation as an Active Validator, you can request that your node is staked XRD so that it can join the validator set. You can do so on the #stokenet-requests channel on our Discord server. Prerequisites Hardware requirements To run a reliable, performant node, we suggest that the node’s hardware profile should match or exceed the following specification:

| CPUs | Memory | Storage | Network Bandwidth |
| --- | --- | --- | --- |
| 4 | 16 GB | 500 GB of SSD space (initially) | At least 100 Mbps recommended |

This is a rough guideline only and each node runner should monitor their node operation to ensure good performance for the intended task. System requirements The recommended operating system for a production node is Ubuntu 22.04.2.0 LTS (Jammy Jellyfish) (https://releases.ubuntu.com/22.04/) .
All our node helper scripts have been tested on this system, as well as all code snippets in this documentation. If you choose to use a different operating system, some of our tools might not be available (e.g. the babylonnode CLI) and you might need to adjust some of the code snippets. The node software can run on the following platforms: - darwin-aarch64 - darwin-x86_64 - linux-aarch64 - linux-x86_64 - windows-x86_64-gnu - windows-x86_64-msvc Additionally, your platform of choice must be capable of running a Java SE 17 compatible runtime environment. We recommend using OpenJDK. Cloud instance recommendations We’re assuming that you’re comfortable provisioning a cloud service instance and installing basic tools (Git, Docker) on it. You’ll also need a working knowledge of Git, and the UNIX command line. AWS provisioning is beyond the scope of this doc, but there are plenty of resources dotted around the web which will help you to get started (https://docs.aws.amazon.com/index.html?nc2=h_ql_doc_do) . Example instance configuration for AWS

| Model | vCPU | Memory | Storage | Operating System | Network Bandwidth |
| --- | --- | --- | --- | --- | --- |
| m6i-xlarge | 4 | 16 GB | Provision a gp2 storage volume (https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ebs-volume-types.html) with initially 500 GiB of SSD space | Ubuntu 22.04.2.0 LTS (Jammy Jellyfish) (https://releases.ubuntu.com/22.04/) | Up to 10 Gbps |

Choosing the installation mode On the next pages we'll guide you through the process of setting up a Radix node. There is more than one way to set up a production node. Choose the one that best suits your needs: Guided Mode (Recommended) In this mode, you’ll use our Python CLI tool (`babylonnode`) for a straightforward, guided installation of a Radix node on an Ubuntu host. We recommend this method for most people. This process works best on a fresh, dedicated machine (e.g. cloud instance).
babylonnode is a versatile tool that can not only assist in setting up a node and related tools, but also help manage them afterwards. Proceed to Guided Mode docs (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/node-setup-guided/README.md) Manual (Advanced) Setup with Docker This section will guide you through a manual (i.e. without using the babylonnode CLI tool) setup where the node and related tools (e.g. nginx) are running inside Docker containers on a host machine. Proceed to Manual Setup with Docker docs (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/manual-setup-advanced/node-setup-docker.md) Manual (Advanced) Setup with systemd This section will guide you through a manual (i.e. without using the babylonnode CLI tool) setup where the node and related tools (e.g. nginx) are running directly on a host machine. The processes will be managed by systemd. This method is suitable for those who want the most hands-on approach, and may be a useful reference for those figuring out installation on different operating systems. Proceed to Manual Setup with systemd docs (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/manual-setup-advanced/node-setup-systemd/README.md) ## Node URL: https://radix.wiki/developers/legacy-docs/run/node/node Updated: 2026-02-18 Summary: Anyone can run a node and join it to the Radix Public Network. However, before you jump in, it's important to understand a few things first! Run > Node — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/node/README.md) Before running a node Anyone can run a node and join it to the Radix Public Network. However, before you jump in, it's important to understand a few things first! You do not need to run a node to use Radix There is no need to run a node of your own in order to submit transactions or see information about things that are on the ledger.
The Radix Wallet will, by default, use a free Gateway provided by the Radix Foundation, and you can freely use the Dashboard (https://dashboard.radixdlt.com) to investigate on-ledger stuff. There is no “mining” on Radix Radix is a delegated proof-of-stake network. There’s no race to discover a magic hash, and spinning up a node won’t start earning you XRD. Only a subset of nodes participate in validation of transactions Any node can be optionally registered as a validator, meaning that it will seek to be part of the consensus process which runs and commits new transactions. However, at each network epoch (about every 5 minutes), only the top 100 validators (ordered by amount of delegated stake) will constitute the active validator set. If you wish for your node to be a validator, you will have to amass a significant quantity of delegated stake. Regardless of whether a node is in the validator set, it still keeps current with the network state, it can still accept transactions for submission and gossip them on, and it can still expose the Core API, which provides information about the current state. An Introduction to Radix Nodes What is a Radix Node? The Radix Node ( GitHub (https://github.com/radixdlt/babylon-node) ) is the building block of Radix Network infrastructure. Nodes connect together to conduct consensus on transactions, maintain the ledger, and provide other useful functions. Nodes can be deployed with various additional features sets. Typically, nodes are deployed for one of two distinct purposes: - Full Nodes are nodes whose purpose is to handle queries for ledger state and history (e.g. check transaction status, list account transactions, etc). Typically full nodes are configured to maintain a complete transaction history, and have various optional additional indices turned on. They can also accept transaction submissions to the network. 
- Validator Nodes’ main goal is to collect enough stake to become Active Validators, participate in Consensus and collect the XRD rewards for doing so. At the same time, they can still provide a full node’s query capabilities, depending on the configuration. In case a node’s built-in query capabilities turn out to be insufficient, we provide separate indexer software with a richer feature set: Network Gateway, which can be set up alongside a Radix Node. Refer to the Network Gateway (https://github.com/radixdlt/babylon-gateway) section for details. All nodes on the Radix Network must maintain a complete ledger state; however, there is a subset of nodes that, in addition to maintaining the complete state, actively participate in extending it (i.e. committing new transactions to the ledger). This process is called Consensus. Actors that participate in Consensus are called Active Validators and all of them constitute a Validator Set. In order to become an Active Validator one must first Register as a Validator (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-maintenance-and-administration/node-registering-as-a-validator.md) (after you’ve set up your node, of course). Validators are represented by an on-ledger entity: Validator Component. Registration creates a new Validator Component entity and opens it to accept XRD token “stake” from delegators. A Validator Component can be (and practically always is) associated with a node on the network - a Validator Node. A Validator that reaches the top 100 (by the amount of delegated stake) is included in the Validator Set for the given epoch and becomes an Active Validator. Their associated node is called an Active Validator Node. All other Validators that have been Registered but didn’t make it to the top 100 are called Inactive Validators and their nodes Inactive Validator Nodes.
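The selection rule described above ("top 100 by delegated stake each epoch") can be sketched with ordinary shell tools. This is purely illustrative — the network computes the set itself; the `top_validators` helper and the "stake address" input format are invented for this example:

```shell
# Reads lines of "<stake> <validator>" on stdin and prints the n validators
# with the most delegated stake -- the same rule the network applies each
# epoch to pick the Active Validator Set (n = 100 on mainnet).
top_validators() {
  local n="$1"
  sort -k1,1 -rn | head -n "$n" | awk '{print $2}'
}

# Usage:
#   printf '50 v1\n900 v2\n300 v3\n' | top_validators 2   # prints v2, then v3
```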
The lifecycle of a Validator A complete overview of the network, including the different kinds of nodes, is presented in the diagram below: Radix Network overview Next: Node Installation and Basic Setup (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-setup/README.md) ## Running Infrastructure URL: https://radix.wiki/developers/legacy-docs/run/running-infrastructure Updated: 2026-02-18 Summary: The infrastructure that underpins the Radix Network comes from a decentralized community that runs two types of software: Run > Running Infrastructure — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/running-infrastructure.md) The infrastructure that underpins the Radix Network comes from a decentralized community that runs two types of software: A Core Node is a direct participant in the Radix Network. The node can be deployed in different configurations to perform the role of a validator, synchronize the state of the network to offer data to clients, or push new transactions to the network for processing. The node exposes the Core API for reading low-level ledger state and submitting transactions. You can find more about running a node in the Node guide (https://github.com/gguuttss/radix-docs/blob/master/run/node/README.md) . A Network Gateway does not directly connect to the Radix Network, but makes use of one or more Core Nodes and uses the Core API to interface with the network. A Gateway acts as a point of entry into the network for typical clients, serving requests about the ledger state and facilitating easy transaction building and submission onto the network. The Radix Wallet and Radix Dashboard connect to Gateways run by the Radix team that will also serve other requests publicly (at a limited rate). But anyone may run a Gateway, and application developers are encouraged to do so to service their own application needs in production.
You can find more about Network Gateways in the Network Gateway guide (https://github.com/gguuttss/radix-docs/blob/master/run/network-gateway/README.md) . ## Run URL: https://radix.wiki/developers/legacy-docs/run/run Updated: 2026-02-18 Summary: Legacy documentation: Run Run — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/run/README.md) ## Patterns for Application Deposit Use Cases URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/scrypto-design-patterns/account-deposit-patterns Updated: 2026-02-18 Summary: A common desire of blockchain applications is to send tokens to users – whether from a dApp smart contract or backend system. Maybe you're building a fully featur Build > Scrypto > Design Patterns > Patterns for Application Deposit Use Cases — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/scrypto-design-patterns/account-deposit-patterns.md) A common desire of blockchain applications is to send tokens to users – whether from a dApp smart contract or a backend system. Maybe you're building a fully featured dApp, maybe you're an exchange, or maybe you just want to do a mass airdrop. On most blockchains, you have one solution: call the transfer method of the token smart contract with a recipient address. Radix's asset-oriented approach means doing things differently, but also provides more power and control for you and your users. Before You Continue Some important things to understand up-front about Radix: - All tokens and NFTs on Radix are native "resources", not smart contracts (https://www.radixdlt.com/blog/its-10pm-do-you-know-where-your-tokens-are) - Sending them means passing a bucket from sender to recipient – not calling a token smart contract - Accounts on Radix are native "components" (i.e.
specialized smart contracts) (https://www.radixdlt.com/blog/how-radix-multi-factor-smart-accounts-work-and-what-they-can-do) - Sending to an account means passing a bucket of resources to a deposit method on it - The account owner can always deposit to their own account (by signing the transaction) - Third parties must use special deposit methods. The user can configure what resources can be deposited by third parties, and who can do so. For example, deposits of unrecognized resources can be turned off, enabling what is often called "no airdrop mode" - Don't treat a Radix user and an account address as the same thing - The Radix Wallet lets users use multiple accounts with dApps, and has a separate "Persona" system to identify themselves - Users may want to use different accounts at different times - "Badge"-based auth on Radix provides a powerful system for authorization that isn't account-driven These unique Radix differences enable a much better experience for users, but as a developer, you might wonder how some use cases are handled. You might be asking: What do I do if I want to airdrop or do a payout to an account that doesn't allow third party deposits? Or: How can I send tokens to users without using a fixed account address? This page explains some simple Radix patterns you can use for deposits, depending on your use case. Deposit Patterns for Specific Use Cases There are a huge variety of circumstances where applications need to send tokens to users, such as: - Returning tokens to a connected user from a DeFi transaction - Airdropping tokens to a large number of accounts with unknown owners - Withdrawing tokens from an exchange to a private account - Regularly and automatically allocating royalty payments or fees to users - Sending a "soulbound" NFT-based credential to a user The solutions to these and other use cases on Radix are not difficult, but require a little bit of reorientation of thinking away from the typical "send to address" crypto way today. 
A useful way to think about the problem is this: In the real world, what are the different ways that a business sends a package to a customer? There isn't a single method. Is the customer standing in front of you? Do they accept packages at their mailing address? Do they have an existing relationship with your business? This "physical package" metaphor is one we'll use in the patterns below to help make them more understandable and intuitive. The various deposit patterns in this document are useful in different situations and for different application needs. The flow chart below summarizes when the various deposit patterns are used. Each of these deposit patterns has a section in this document that dives deeper into it, its properties, how it's used, and its advantages. The "Online" User-signed Deposit Pattern The simplest case is that of a user actively interacting with your application, where tokens need to be returned to the user at that point in time. For example, they are doing a swap with your DEX dApp and the result of the trade needs to be returned to them. To use the "physical package" metaphor, you don't need to send anything through the mail because your user is standing right in front of you – you just give them the package directly, and they can put it wherever they like. How to use the User-signed Deposit pattern In this pattern, your on-ledger dApp component's methods only need to return a bucket of tokens. The component shouldn't try to decide where those tokens should be deposited. For example, a DEX takes some tokens to be swapped as input, and simply returns the bucket of resulting tokens. (This also makes the component design highly composable!) To continue the example, the DEX application's frontend will then build a transaction manifest to do the DEX swap and deposit the results.
This will include withdrawing the tokens to be traded away from the user's account, passing them to the DEX component, and then depositing the trade's results to the user's account. The DEX frontend has probably obtained the user's preferred account from the Radix Wallet, and used that selection to determine what account to use for the withdraw and deposit in the transaction manifest it builds. In this case, your transaction manifest can confidently use the account component's deposit method directly (https://www.radixdlt.com/blog/using-native-accounts-to-interact-with-your-users-assets) – because the user themself will be signing and submitting the transaction to the network in their Radix Wallet. Because they are signing it, they have the authority to deposit anything they like to their own account. More difficult, however, are the situations where your user is "offline" and you wish to send them some tokens. To use the "physical package" metaphor, the recipient isn't standing in front of you and you somehow want to get the package to them through the mail. That's where the following patterns come in… "Offline" Send-to-User Application Patterns To send tokens to a user from your components or backend system directly – not via a transaction that the user signs while connected, as above – there are three patterns that are recommended for developers, depending on the situation.

| Development Pattern | "Package" model description | Use Cases |
| --- | --- | --- |
| Badge and Claim Pattern | Come pick up your package from us directly at our place of business. | Most standard dApps, with Scrypto components that need to send tokens to users while they are "offline" |
| Account Locker Pattern | We'll try to send the package to your house, but if we can't, you'll have to come pick it up. | "Fire and forget" sends to an account address, like mass airdrops or exchange withdrawals |
| Authorized Depositor Pattern | Give us permission to make a special delivery directly to your house. | Sending a "user badge", or doing continuous distributions from a component or backend system the user trusts |

The Badge and Claim Pattern This is the pattern to use for the great majority of normal decentralized application and backend transfer needs. The situation: You have a relationship with a user beyond a single transaction. Often the user is logging in to your web frontend. You want to send tokens to this user for a variety of reasons - either during an active "session" with the user, or asynchronously when they are offline. You may have specific rules that need to be met before the user can receive the tokens. If the tokens are like a physical package, in the badge and claim pattern, you will be telling the user "come pick up your package from us directly at our place of business". How to use the Badge and Claim pattern On a typical blockchain, you might register a particular address for the user and have your logic transfer tokens to it. But of course on Radix the user's account may reject that deposit of tokens - and then what do you do? Also, it's much better if you allow your user to use whatever accounts they want to with your application over time, rather than tying them to a single one. Instead, it is recommended that your application use this process: - Identify your user not by account address, but by a unique NFT "badge" that they hold. This badge could be a persistent non-transferable "user badge" (see pattern #3), or it might be more like a single-use "claim ticket". - When you have tokens to send to your user, store them within your own component in a vault keyed to that unique badge. - In your frontend, let the user "claim" the tokens at their own convenience. Your frontend builds a transaction in which a proof of their user badge is produced and passed to your component – or the badge itself is "turned in", in the case of a single-use claim ticket. Your component can examine that proof/badge to identify the user and return the correct tokens.
- The claim transaction can then deposit to any of the user's accounts, as in the "online" pattern above, since the user themself signs and submits the transaction. ( More about this pattern can be found here (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/scrypto-design-patterns/user-badge-pattern.md) .) However, you might be thinking "if I use a non-transferable user badge, how do I send that badge to the user and ensure it reaches the right person?" Good question: see pattern #3 below. Advantages - You don't have to continually care about the third party deposit settings of your user's accounts. Even if they allow no third party deposits, they can still claim their tokens from you. - You can allow your user to use whatever accounts they want with your application, whenever they want. As long as they can produce the proof of the user badge, they can claim their tokens. - The user badge may also be useful to gate access to certain features of your dApp's components. - You may include other rules associated with the ability to claim. Perhaps the user has to produce proof not only of a user badge, but also of a KYC badge issued by another application (or even use that badge as the user badge itself). Perhaps you want to limit the time period during which they can claim, etc. Maybe you want to issue specific claim ticket assets for particular deposits. The Account Locker Pattern Important Note The Account Locker blueprint (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/locker.md) arrived in June 2024 with the Bottlenose protocol update to the Radix Network, but the Radix Wallet functionality described here isn't yet available. In the meantime, dApps will need to propose the transaction for the user to claim from the locker through the dApp's website. This is the pattern to use if you have an absolute need to conduct "no fail" deposits to particular account addresses.
Exchanges or bulk airdrops are the classic examples. The situation: You may have no knowledge or relationship with the user other than the account address, and you simply want a fire-and-forget solution to distribute tokens. If the tokens are like a physical package, the account locker pattern here is: "We'll try to send the package to your house, but if we can't, you'll have to come pick it up." How to use the Account Locker pattern The Radix network includes a native component called an Account Locker (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/locker.md) which is specially designed to enable this pattern. In short, you instantiate a locker for your dApp and can use it to hold deposits for particular account addresses that may be claimed exclusively by the account owners. If you wish, you can have the locker attempt to directly deposit the assets to the account, and only hold them in the locker if this isn't possible. Basically the locker kind of acts like an "Amazon pick-up locker" that you can notify the user of, and ask them to claim from. If you link your locker to your dApp Definition, and the user's wallet already trusts your dApp, then the user may be notified that there is something from your dApp waiting to be claimed and do the claim from the wallet directly. Use the following process: - Instantiate a single locker component for use by your dApp. You never need more than one. - (Recommended) Configure the verification metadata (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/metadata-standards/metadata-for-verification.md) on the locker to include a dapp_definition field set to your dApp Definition address. In that dApp Definition, add the locker address to the claimed_entities list. This provides a verified link between the locker and your dApp for use by the Radix Wallet.
Also add the locker address to an account_locker field on the dApp Definition to let the Radix Wallet know that it might check this address for available claims. - When you wish to send tokens to addresses, use the locker's send method. This will take an account address, resources to be sent, and a flag that controls whether the method should first attempt to deposit the resources into the account or not. - If the method is invoked with the try_direct_send flag set to true, the locker will attempt to immediately deposit the resources in that account, but if that is not possible (due to account deposit settings) it will store those resources for later claim by that account's owner. - If the method is invoked with the try_direct_send flag set to false, the locker immediately keeps the resources for claim by the account owner without attempting to directly deposit. This may be used as a fallback if your own system wishes to attempt the direct deposit itself, or if you prefer to have all users receive their resources by claiming them. - If you did step 2: The Radix Wallet will automatically watch the locker associated with your dApp, if it has previously connected to that dApp's frontend. If there are any resources waiting for claim by accounts controlled by that user's wallet, the wallet will display a simple alert that there are tokens from your dApp waiting to be claimed, and they will be able to claim them directly from their wallet with a tap. - If you did not do step 2, or if the user has never connected to your dApp: You will likely need to create a simple web UI where the user can connect their wallet and make their claim, and you will need to inform the user that there may be a claim waiting for them. For example, a mass airdrop might simply broadcast on social media that users should visit your website and connect their wallet to see if they were a recipient of the airdrop.
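Tying the recommended metadata setup together: the wallet-visible linkage lives in metadata on two entities. The sketch below is pseudo-configuration with placeholder addresses — the actual metadata is set on-ledger via transaction — but the field names (dapp_definition, claimed_entities, account_locker) are the ones described in the steps above:

```
# On the locker component's metadata:
dapp_definition  = "account_rdx1...your_dapp_definition"   # points at your dApp Definition

# On the dApp Definition account's metadata:
claimed_entities = ["locker_rdx1...your_locker"]   # verified back-link to the locker
account_locker   = "locker_rdx1...your_locker"     # tells the Radix Wallet to watch
                                                   # this locker for pending claims
```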
Advantages
This provides a good solution for one-time sends to particular accounts. It's also good for sends where the application needs to fire off the send and have it be "off the books": the user can pick up the deposit or not, but the dApp can forget about it. In short, while it isn't recommended for applications with an ongoing relationship with the user (where the Badge and Claim pattern may be better), this pattern provides a close replacement for the typical crypto "send to address" option. However, keep in mind that it will not work for non-transferable tokens, leading us to the final pattern…
The Authorized Depositor Pattern
This is the pattern to use in the narrow circumstances where you must make a direct deposit to a particular account, but where the Account Locker pattern is unsuitable. The primary use case here is sending a user a non-transferable ("soulbound") token to a particular account. In many cases this badge might be a user badge (used with the claim pattern) or another type of KYC or credential badge. Typically the user has provided proof that they own a particular account, and the non-transferable token must be deposited into that particular account. Perhaps the user provided a ROLA proof of an account address to your backend system with their Radix Wallet, and now you wish to send a user badge to that account for future use. This pattern might also be suitable for exchanges as an alternative to the Account Locker pattern where the user demands direct deposit to their account while they are offline. The "physical package" pattern here is: "Give us permission to make a special delivery directly to your house."
How to use the Authorized Depositor pattern
All Radix accounts give the owner the ability to configure their own preferences for what tokens (resources) may be deposited to that account by an unknown third party.
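The deposit preferences just mentioned can be modeled in plain Rust to show how the checks fit together: a default deposit rule, a per-resource preference map that overrides it, and an authorized depositor list that overrides everything. All type and field names below are illustrative, not the actual account component API:

```rust
use std::collections::{HashMap, HashSet};

// Simplified model of an account's third-party deposit rules.
enum DefaultDepositRule {
    Accept,
    Reject,
    AllowExisting, // accept only resources the account already holds
}

enum ResourcePreference {
    Allowed,
    Disallowed,
}

struct AccountRules {
    default_rule: DefaultDepositRule,
    resource_preferences: HashMap<String, ResourcePreference>,
    authorized_depositors: HashSet<String>,
    held_resources: HashSet<String>,
}

impl AccountRules {
    /// Would a try_deposit of `resource` succeed, given an optional
    /// depositor badge presented by the caller?
    fn deposit_allowed(&self, resource: &str, depositor_badge: Option<&str>) -> bool {
        // An authorized depositor bypasses all other rules.
        if let Some(badge) = depositor_badge {
            if self.authorized_depositors.contains(badge) {
                return true;
            }
        }
        // Otherwise a per-resource preference takes priority over the default.
        match self.resource_preferences.get(resource) {
            Some(ResourcePreference::Allowed) => true,
            Some(ResourcePreference::Disallowed) => false,
            None => match self.default_rule {
                DefaultDepositRule::Accept => true,
                DefaultDepositRule::Reject => false,
                DefaultDepositRule::AllowExisting => self.held_resources.contains(resource),
            },
        }
    }
}
```

This ordering (depositor badge first, then per-resource preference, then default) is why adding your dApp's badge as an authorized depositor makes deposits succeed regardless of the account's other settings.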
( See documentation on the account component (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/account.md) ) If you wish to make a direct deposit of a non-transferable token to an account, your system can first ask a Radix Gateway or Node, via API (https://github.com/gguuttss/radix-docs/blob/master/integrate/network-apis/README.md) , whether a deposit of the token will be allowed by the account's settings. This includes checking whether either the default account deposit mode or the resource preference map would block the deposit. If the answer is that the deposit is okay, go ahead and make the deposit using one of the try_deposit methods and you're done! If, however, the account will not allow the direct deposit, you can ask the user to allow your particular application to deposit to that account by adding you as an "authorized depositor". Use the following process:
- Create a resource to use as a depositor badge. Store this badge within the dApp component that will be conducting the direct deposit to the user's target account. (This can be a single badge used for all users of your system.)
- Submit a transaction request to the user's wallet (https://github.com/radixdlt/radix-dapp-toolkit?tab=readme-ov-file#transaction-requests) for a transaction that adds your dApp's badge to the target account's authorized depositor list. The user will be presented with an understandable summary of this transaction explaining that you are requesting this addition to their account's rules. You may wish to inform the user in your UI that you are making this request, and why. You may even want to make this request part of the "user onboarding" flow for your dApp after they have logged in with their wallet (https://github.com/radixdlt/radix-dapp-toolkit?tab=readme-ov-file#login-requests) .
- Once this transaction is submitted by the user, your application may present proof of its badge in any transaction your system submits that uses a try_deposit call to the user's account. Because your badge is on the authorized depositor list, the deposit will succeed regardless of any other rules set on the account.
Advantages
Once the user has approved your depositor badge, your application has free rein to deposit to that account - just like on a typical crypto network (although using the try_deposit method of the Radix account component). However, it is still encouraged to use the "claim pattern" rather than relying on this pattern completely. By using this pattern to send your user a user badge, you allow the user to use different accounts with your application in the future, and you avoid having to continually submit new depositor badge transactions to the user whenever you wish to deposit to a different account. ## Reusable Blueprints Pattern URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/scrypto-design-patterns/reusable-blueprints-pattern Updated: 2026-02-18 Summary: This may be looked at as a best practice for blueprint developers hoping to benefit from developer royalties, rather than a design pattern. Nonetheless, this is Build > Scrypto > Design Patterns > Reusable Blueprints Pattern — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/scrypto-design-patterns/reusable-blueprints-pattern.md) This may be looked at as a best practice for blueprint developers hoping to benefit from developer royalties, rather than a design pattern. Nonetheless, it is an important best practice and warrants its place here. When Babylon (https://learn.radixdlt.com/article/what-is-the-radix-roadmap?_gl=1*dofazv*_ga*MTIzMTQzODAzMS4xNjgxNDA1NDI2*_ga_MZBXX3HP5Q*MTY5NDUzODQwNS4xOTMuMS4xNjk0NTM4NDA1LjYwLjAuMA..)
comes, blueprint developers will have the ability to publish their blueprints to the blueprint catalog and impose royalties on them. Developers hoping to attract a large number of other developers to use their blueprint are encouraged to ensure that their blueprints are reusable, so that they cater to a wide range of developers. Ensuring that your blueprints are general and reusable is not something that is done in a single step. There isn't a "reusable" switch that you can flip and have your blueprints be reusable; rather, reusability comes from a number of small steps which, when carried out together, yield general, reusable blueprints which fit the needs of a wide array of developers.
Multiple Instantiation Functions
A common approach seen in many blueprints is an instantiation function which creates an admin badge and treats its holder as the admin. Then, when a caller presents this badge, they are authorized to call administrative methods on the component. While this works well when the administrator is a single person or entity, it does not lend itself well to cases where the instantiator of your blueprint is a complex organization which requires the ability to set up components with complex access rules. Consider an organization with such a structure which wishes to use your blueprint to instantiate a new component. While the single admin badge worked well under the assumption of a single admin, it does not suit such a complex organization at all. Very quickly, many questions begin popping up: Who should be given the admin badge? Should the admin badge be mintable to allow additional people to be included? How should the complex auth required here, with "and" and "or" logical operations, be handled?
It is clear that there is a need for both: simple ways of instantiating components which create the badges and all of the required resources, as well as ways which allow users with complex authorization needs to fulfill them. Therefore, it is recommended that blueprints implement two (or more) instantiation functions aimed at two main groups:
- Simple instantiation functions: these create all of the required badges, resources, and components on behalf of the caller and then return everything to them. This type of instantiation function is convenient to use but limiting, as it does not provide much flexibility in how access rules are set.
- Custom instantiation functions: this is an entire class of instantiation functions which allow the instantiator to set up their own access rule(s) on the components they are instantiating, giving them the flexibility to set rules that fit the context in which they are instantiating the component.
Examples: - Payment Splitter (https://github.com/radixdlt/scrypto-examples/tree/main/core/payment-splitter)
Small Modular Reusable Blueprints
This best practice is not unique to Scrypto; it is a universal best practice which you will encounter in almost all programming languages in all fields. When working on a Scrypto package, it's important to notice repeating patterns which could be refactored into their own blueprint, or even better, to notice which pieces of your logic can be replaced by components of a blueprint already available in the blueprint catalog. When developing a system, it might be tempting to put everything into a single blueprint which would manage the system and would essentially be the system.
This approach is typically faster when you are first developing a system prototype; however, in no time, bugs will find their way into such a system, and maintaining it becomes very difficult as it quickly becomes harder to reason about and harder to debug due to the sheer number of moving pieces. Instead of making your application a single large blueprint, it is recommended that you use the blueprint-component concepts in Scrypto to develop small blueprints which are used by other blueprints. This establishes a good separation of concerns, where each blueprint has one single task that it performs well. When other blueprints need to perform this task, they simply instantiate a new component from this blueprint and make use of it. This approach comes with a wide array of advantages, such as:
- Smaller blueprints are typically more secure by virtue of being easier to reason about, so many errors are usually found while writing the blueprints.
- Writing unit tests for a smaller blueprint is easier than for larger blueprints. In smaller blueprints, it's easier to see what the edge cases are that need to be tested for, and since the blueprint is smaller, you can write tests for all of the edge cases that come to mind.
- Building a larger blueprint on top of a number of smaller, well-tested blueprints provides the developer with the guarantee that the blueprints they are using do exactly what they believe they do. It alleviates the burden of testing every single portion of the larger blueprint, as the smaller blueprints being used have already been tested.
- In almost all cases, adding features to a smaller blueprint will be easier than to a larger blueprint. This comes down to a very important point: smaller blueprints are easier to reason about, and their edge cases can easily be seen.
- You might have developed your blueprints thinking that you would be the only user of them.
However, you might quickly find other developers instantiating components from your blueprint to use in their projects. Your smaller, more modular blueprint would allow you to benefit from Radix's developer royalty system. Of course, to every rule there is an exception. The application that you are building might be a small application where splitting it across multiple different blueprints would not make sense or would lead to great complexity. These cases do exist, and in that case, it is perfectly fine to have all of your logic implemented in a single blueprint. ## Transient Badge Pattern URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/scrypto-design-patterns/transient-badge-pattern Updated: 2026-02-18 Summary: The class of problems that this pattern solves is quite simple to describe but complex to think about and solve. However, this pattern solves this class of prob Build > Scrypto > Design Patterns > Transient Badge Pattern — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/scrypto-design-patterns/transient-badge-pattern.md) The class of problems that this pattern solves is quite simple to describe but complex to think about and solve. However, this pattern solves this class of problems in a simple, elegant, and uniquely asset-oriented way. As always, this document begins by laying out the problem before describing the pattern and how it solves the problem. Say we would like to build a flash loan application where borrowers first need to call take_loan() on your component and then call repay_loan() when they wish to return the funds. The application has two key requirements that it must satisfy:
- The general security requirements typically present in any flash loan application must be satisfied. That is, the loan must be returned in full, plus any interest imposed on the loan.
- The application must return the loan to the transaction worktop so that the borrower may use the funds freely throughout the transaction.
The two requirements above might seem to contradict one another. How can we ensure that the borrower is able to borrow funds to their transaction worktop while also ensuring that the borrower will—​at some point in the transaction—​call repay_loan() to pay back their loan in full? Obviously, this cannot be something that we trust the user to perform. But how can we ensure that the funds must come back to our component? A solution which satisfies the first requirement but not the second is to have the take_loan() method take in a ComponentAddress and a String: the address of the component to make calls to and the name of the method to call, respectively. In the take_loan() method, a call is made to the specified method on the specified component. We then check that the result of the call is a bucket, check the amount in the bucket, and determine whether the loan was paid back or not. While this gets us a basic flash loan application, this approach is not composable at all. So, this leaves us back at square one: how do we move forward from here?
Transient Resources
A transient resource is a resource which cannot be withdrawn from or deposited into any vault. It can be minted by some authority and burned by some authority, but not deposited or withdrawn. The powerful checks that the Radix Engine performs on transactions, combined with the characteristics of transient badges, allow this problem (the flash loan problem) to be solved in a simple, elegant, asset-oriented way. In this application, the transient non-fungible token acts as the terms of the loan. It includes data on how much was taken as a loan, how much we expect to get back, what interest rate the loan was given at, and what tokens we expect to get back.
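Why a transient resource forces repayment comes down to two facts: a transaction only commits if its worktop is empty at the end, and a transient resource can never be deposited, only burned by its issuer. A toy plain-Rust model of that invariant (all names here are illustrative, not Scrypto APIs):

```rust
// Toy model: transient resources can only leave the worktop by being burned.
#[derive(Debug, PartialEq)]
enum ResourceKind {
    Normal,    // may be deposited into a vault
    Transient, // may only be burned by the issuing component
}

struct WorktopItem {
    kind: ResourceKind,
    amount: u64,
}

struct Transaction {
    worktop: Vec<WorktopItem>,
}

impl Transaction {
    /// Attempt to deposit everything; transient resources stay on the worktop.
    fn deposit_all(&mut self) {
        self.worktop.retain(|item| item.kind == ResourceKind::Transient);
    }

    /// The issuer burns the transient item only once the loan terms are met.
    fn burn_transient(&mut self, repaid: u64, amount_due: u64) -> Result<(), &'static str> {
        if repaid < amount_due {
            return Err("loan not repaid in full; transient badge not burned");
        }
        self.worktop.retain(|item| item.kind != ResourceKind::Transient);
        Ok(())
    }

    /// A transaction only commits if the worktop is empty at the end.
    fn finalize(&self) -> Result<(), &'static str> {
        if self.worktop.is_empty() {
            Ok(())
        } else {
            Err("ResourceCheckFailure: resources left on worktop")
        }
    }
}
```

A malicious borrower can deposit the borrowed tokens, but the transient badge stays on the worktop, so finalize fails and the whole transaction is reverted.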
The transient resource pattern—​in the context of the flash loan example—​can be implemented as follows:
- When the borrower calls take_loan(), they are given the funds that they wish to borrow as well as a transient non-fungible token which contains information on how many tokens they borrowed. The transient non-fungible badge cannot be deposited into any vault, as it is transient. Thus, if the borrower acts maliciously and decides not to return the tokens, they are stopped by the Radix Engine: they would be able to deposit the borrowed tokens into their vault, but not the transient token. This leads the transaction to fail—​with a ResourceCheckFailure—​as the transient badge would remain on the transaction worktop. Transactions in Radix work on an all-or-nothing basis, meaning that if the entirety of the transaction does not succeed, then the entirety of the transaction will fail. Thus, even if they had "deposited" the borrowed tokens into their vault, the transaction failure—​led by the transient badge not being deposited—​would essentially "reverse" history and bring the state back to before the tokens were deposited.
- Once the funds are in the borrower's transaction worktop, they can do with them as they wish. They can send them to somebody else, they can exchange them on a decentralized exchange, or they can even deposit the borrowed funds into their own account component. However, if they do not pay the loan back by the end of the transaction, the transient badge will remain in their transaction worktop, which will cause the transaction to fail. For the transaction to succeed, the funds must be returned to the component by calling the repay_loan() method. Just like take_loan() gave the caller the funds and the transient badge, repay_loan() takes the funds and the transient badge back.
Once the method gets the funds and the transient non-fungible token back, it checks the data on the transient badge to ensure that the loan was paid back in full in the expected token. Only when this happens is the transient badge burned and the transaction allowed to continue.

```rust
use scrypto::prelude::*;

#[derive(NonFungibleData, ScryptoSbor)]
pub struct LoanDue {
    pub amount_due: Decimal,
}

#[blueprint]
mod basic_flash_loan {
    struct BasicFlashLoan {
        loan_vault: Vault,
        transient_resource_manager: ResourceManager,
    }

    impl BasicFlashLoan {
        pub fn instantiate_default(initial_liquidity: Bucket) -> Global<BasicFlashLoan> {
            // - snip -

            // Define a "transient" resource which can never be deposited once created.
            let transient_token_manager = ResourceBuilder::new_ruid_non_fungible::<LoanDue>(OwnerRole::None)
                .metadata(metadata!(
                    init {
                        "name" => "Promise token for BasicFlashLoan - must be returned to be burned!", locked;
                    }
                ))
                .mint_roles(mint_roles!(
                    minter => rule!(require(global_caller(component_address)));
                    minter_updater => rule!(deny_all);
                )) // #1
                .burn_roles(burn_roles!(
                    burner => rule!(require(global_caller(component_address)));
                    burner_updater => rule!(deny_all);
                )) // #1
                .deposit_roles(deposit_roles!(
                    depositor => rule!(deny_all);
                    depositor_updater => rule!(deny_all);
                )) // #1
                .create_with_no_initial_supply();

            Self {
                loan_vault: Vault::with_bucket(initial_liquidity),
                transient_resource_manager: transient_token_manager,
            }
            .instantiate()
            .prepare_to_globalize(OwnerRole::None)
            .globalize()
        }

        pub fn take_loan(&mut self, loan_amount: Decimal) -> (Bucket, Bucket) {
            assert!(
                loan_amount <= self.loan_vault.amount(),
                "Not enough liquidity to supply this loan!"
            );

            // Calculate how much we must be repaid
            let amount_due = loan_amount * dec!("1.001"); // #2

            let loan_terms = self.transient_resource_manager.mint_ruid_non_fungible( // #2
                LoanDue { amount_due },
            );
            (self.loan_vault.take(loan_amount), loan_terms)
        }

        pub fn repay_loan(&mut self, loan_repayment: Bucket, loan_terms: Bucket) {
            assert!(
                loan_terms.resource_address() == self.transient_resource_manager.resource_address(),
                "Incorrect resource passed in for loan terms"
            );

            // Verify we are being sent at least the amount due
            let terms: LoanDue = loan_terms.as_non_fungible().non_fungible().data();
            assert!(
                loan_repayment.amount() >= terms.amount_due,
                "Insufficient repayment given for your loan!"
            );

            self.loan_vault.put(loan_repayment);

            // We have our payment; we can now burn the transient token
            loan_terms.burn(); // #3
        }
    }
}
```

- Since this is a transient non-fungible badge, it can be minted and burned, but cannot be deposited into any vault by anybody, including an admin badge holder.
- Minting a transient non-fungible token which contains the loan terms.
- With the loan being paid back in full, the transient resource may now be burned to allow the transaction to continue.
This pattern is very powerful and allows for a large degree of control over the transaction by the blueprint creator. The general rule of this pattern is: you can use this pattern when you wish for two or more methods to be called in a specific order—​defined by you, the blueprint developer—​in the same transaction. The above diagram showcases how this pattern can be used to ensure that n methods will be called in the same transaction and how the different transient badges involved are handled. Each method takes in the transient badge produced by the method before it, burns it, and creates a new transient badge which may only be burned by the method that follows it. If the transaction does not get to the last method, it fails due to the characteristics of transient resources.
If the transaction gets to the last method, then the last transient badge is burned, and the caller is allowed to proceed freely. Examples - Basic Flash Loan (https://github.com/radixdlt/scrypto-examples/tree/main/defi/basic-flash-loan) ## The Withdraw Pattern URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/scrypto-design-patterns/the-withdraw-pattern Updated: 2026-02-18 Summary: This pattern builds upon the user badge pattern (user-badge-pattern.md) and could be considered as one of its use cases. The problem which this pattern solves is Build > Scrypto > Design Patterns > The Withdraw Pattern — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/scrypto-design-patterns/the-withdraw-pattern.md) This pattern builds upon the user badge pattern (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/scrypto-design-patterns/user-badge-pattern.md) and could be considered one of its use cases. The problem which this pattern solves is the problem of getting funds from component A to component B. Similar to the badge pattern, if you come from the Ethereum world, you might assume that "sending" funds to the address of the component is the way to go here. However, it is not. Ethereum's approach leads to reentrancy (https://quantstamp.com/blog/what-is-a-re-entrancy-attack) attacks which are often triggered by the contract's fallback function. In addition, Ethereum's approach allows contracts to receive funds which they did not agree to receive. While this might seem like a non-issue when thinking about it in the context of accounts, the real issue becomes clearer when considering the fact that anybody can send funds to a liquidity pool to tip the balance of the pool, without the pool having any say on whether it wishes to accept or reject such funds.
While the direct transfer of funds is possible in Scrypto by calling the appropriate methods on the recipient's component, it is heavily discouraged to perform deposits in this way from inside your blueprints. Before going with the hard-coded approach of "sending" funds to a component, ask yourself the following questions:
- How can I be certain that the recipient's component address corresponds to an account component and not some other component?
- If the recipient's component is not an account component, does it have methods such as "deposit" to allow for direct deposits into the component?
- If the recipient's component does not have a "deposit" method, how do I determine the appropriate methods to use for the deposit from inside my component?
- How can I determine whether the recipient's component has a vault for the type of tokens which I will be sending?
- Is it desirable for my blueprint to only work with account components and fail when any other component is used?
- What if my user wishes to use the withdrawn funds atomically in a transaction instead of having the funds sent to their account? Remember, there is a strong use case for allowing your users to do so.
Clearly, the address-based approach to managing funds raises too many questions and would, in all likelihood, result in a number of assumptions being made. These hard-coded assumptions could prove catastrophic in edge cases where they do not apply, which could result in funds getting locked in the component forever.
Locking and Withdrawing Funds
The recommended pattern for handling funds which are owed to another component is to issue badges for the different entities in your system and keep track of owed funds in your component. You can then write methods which withdraw the funds owed to the caller after authenticating the caller's badge to ensure that they're a valid entity in the system. Imagine you are building a lottery blueprint.
You want participants to send XRD to a vault from which, later, the winner will receive their prize. You might think of including a method distribute_funds which, when called by an admin, moves the prize directly to the winner's wallet. There are multiple problems with this approach. First, it is not really composable. What if the user wanted to do something with the prize tokens in the same transaction? Second, what if the user provided an address that is not a wallet? The funds would be locked in the component. Instead, you should create a withdraw method where users present their badge, and if they are the winner, the funds get returned:

```rust
use scrypto::prelude::*;

#[derive(ScryptoSbor, NonFungibleData)]
struct TicketData {
    minted_on: Epoch,
}

#[blueprint]
mod lottery_example {
    struct LotteryExample {
        xrd_vault: Vault,
        ticket_resource_manager: ResourceManager,
        // Here we keep track of the winner's NFT badge id
        winner_id: Option<NonFungibleLocalId>,
    }

    impl LotteryExample {
        pub fn create_lottery() -> Global<LotteryExample> {
            let (address_reservation, component_address) =
                Runtime::allocate_component_address(LotteryExample::blueprint_id());

            // Create a badge to identify each participant
            let ticket_badge_manager = ResourceBuilder::new_ruid_non_fungible::<TicketData>(OwnerRole::None)
                .metadata(metadata!(
                    init {
                        "name" => "Lottery Ticket", locked;
                    }
                ))
                .mint_roles(mint_roles!(
                    minter => rule!(require(global_caller(component_address)));
                    minter_updater => rule!(deny_all);
                ))
                .create_with_no_initial_supply();

            Self {
                xrd_vault: Vault::new(XRD),
                ticket_resource_manager: ticket_badge_manager,
                winner_id: None,
            }
            .instantiate()
            .prepare_to_globalize(OwnerRole::None)
            .with_address(address_reservation)
            .globalize()
        }

        pub fn enter_lottery(&mut self, xrd: Bucket) -> Bucket {
            self.xrd_vault.put(xrd);
            self.ticket_resource_manager.mint_ruid_non_fungible(TicketData {
                minted_on: Runtime::current_epoch(),
            })
        }

        // After the winner was picked, they can withdraw the funds
        pub fn withdraw(&mut self, ticket_badge: Proof) -> Bucket {
            let ticket_badge = ticket_badge.check(self.ticket_resource_manager.address());
            let nft: NonFungible<TicketData> = ticket_badge.as_non_fungible().non_fungible();
            assert!(
                self.winner_id.as_ref().unwrap() == nft.local_id(),
                "You are not the winner"
            );
            self.xrd_vault.take_all()
        }
    }
}
```

This approach to handling the distribution of funds brings with it a number of advantages, such as:
- No one person ends up paying the fee for the sending of tokens (although this is possible to do if they want to).
- It removes the burden of figuring out how the funds can be sent.
- It allows the blueprint to work with account and non-account components which may or may not expose similar methods for deposits.
- It ensures atomic composability.
Examples - Radix Name Service (https://github.com/radixdlt/scrypto-examples/tree/main/core/radix-name-service) - Payment Splitter (https://github.com/radixdlt/scrypto-examples/tree/main/core/payment-splitter) - Vesting (https://github.com/radixdlt/scrypto-examples/tree/main/core/vesting) ## Actor Virtual Badge Pattern URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/scrypto-design-patterns/actor-virtual-badge-pattern Updated: 2026-02-18 Summary: When we’re writing our blueprints we’ll eventually come to a point where we need to think about how a component can be given authority to perform some kind of a Build > Scrypto > Design Patterns > Actor Virtual Badge Pattern — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/scrypto-design-patterns/actor-virtual-badge-pattern.md) When we’re writing our blueprints we’ll eventually come to a point where we need to think about how a component can be given authority to perform some kind of action. The user badge pattern (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/scrypto-design-patterns/user-badge-pattern.md) introduces concepts of how to use resources as "badges" to provide authority for certain actions.
However, when it comes to enabling components to perform authorized actions, the component itself does not need its own badge to provide permission. The component has the ability to provide the necessary proofs on its own behalf to perform a permissioned action. This can be done with the actor virtual badge pattern, and it has two common use cases:
Resource Behavior Permissions
It's a common pattern to have some variation of resource behaviors which needs an autonomous entity permitted to perform those resource actions. This can be the ability to mint a user badge every time a "create_user" method is called, to burn a resource, or otherwise. This kind of pattern is perfectly suited to employing an actor virtual badge to give the component permission to perform such resource actions. Let's take the contrived blueprint example below:

```rust
use scrypto::prelude::*;

#[blueprint]
mod rad_social {
    struct RadSocial {
        user_badge_manager: ResourceManager,
    }

    impl RadSocial {
        pub fn instantiate_rad_social() -> Global<RadSocial> {
            let (address_reservation, component_address) =
                Runtime::allocate_component_address(RadSocial::blueprint_id()); // #1

            let user_badge = ResourceBuilder::new_fungible(OwnerRole::None)
                .metadata(metadata!(
                    init {
                        "name" => "User Badge", locked;
                    }
                ))
                .mint_roles(mint_roles! {
                    minter => rule!(require(global_caller(component_address))); // #2
                    minter_updater => rule!(deny_all);
                })
                .create_with_no_initial_supply();

            Self {
                user_badge_manager: user_badge,
            }
            .instantiate()
            .prepare_to_globalize(OwnerRole::None)
            .with_address(address_reservation) // #3
            .globalize()
        }

        pub fn create_user(&mut self) -> Bucket { // #4
            self.user_badge_manager.mint(1)
        }
    }
}
```

- We allocate an address_reservation and component_address for the component to create its own caller badge.
- We specify the resource's minter role with an AccessRule that uses the component_address to create a caller badge.
The global_caller function conveniently allows the component to create its own virtual actor badge to present proof that it has permission to perform such an action.
- Allocating the component's address requires us to instantiate the component with .with_address(address_reservation) to solve the chicken-and-egg problem of knowing the ComponentAddress before it has been created.
- The component can now freely mint a user badge every time the method is called.
## User Badge Pattern URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/scrypto-design-patterns/user-badge-pattern Updated: 2026-02-18 Summary: The problem that this pattern solves is the problem of how to tell that somebody is an admin in your system, or more generically, how to tell that a user has th Build > Scrypto > Design Patterns > User Badge Pattern — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/scrypto-design-patterns/user-badge-pattern.md) The problem that this pattern solves is how to tell that somebody is an admin in your system, or more generically, how to tell that a user has the authority to perform a certain action in the context of your system. If you come from the Ethereum world, your first intuition might be to use the caller's address to perform authorization checks. If the caller's address is in the list of whitelisted addresses, then their call is valid; otherwise, they're not authorized to perform the call. The address-based approach, however, is far from optimal, as it introduces issues with composability, and the smallest of mistakes in implementation can have the biggest of consequences, to the extent of privileged users losing access. The address-based approach to authorization has the following problems:
- It assumes a one-to-one mapping of public keys to account addresses; that is, this approach assumes that one key-pair can only control one account.
While this is true on Ethereum, it is not true on Radix, where an account is a component and a single key-pair can have control over multiple accounts. Thus, authorizing one of my owned component accounts to perform an action while not authorizing the others feels weird and clunky.
- It requires that additional methods are implemented on the smart contract to allow additional authorized or whitelisted addresses to be added, removed, replaced, and so on. Failure to implement such functions correctly could mean that the contract "owner" is forever locked and may never be changed. This proves to be an issue when considering the possibility of accounts being hacked, a dApp getting sold, or the contract’s "owner" wishing to add other admins to a contract which used to be single-admin.
- Unless the required functions are present on the contract, this approach mandates that the contract’s "owner" must never migrate to a new account, not even in the case of a hack, as their authority to perform privileged actions on a contract is directly linked to their account address.
- Most implementations of the Ownable contract on Ethereum assume that the contract creator is the contract’s owner. This would mean that, in the case that the caller was using a proxy which they no longer have access to, the proxy contract would be considered the "owner" of the contract.
- In Ethereum, there may only be one contract caller at a time, as Ethereum has no native support for multi-signature transactions. This means that authorization logic which involves multiple signatures is a lot harder to implement correctly.

With the above points in mind, using the caller’s address to perform authorization checks begins to seem less intuitive, and moves from being a simple solution to implement to a design decision which requires a lot of thought about the system’s current and future requirements.
Using Badges for Authorization

Radix solves the problems of address-based authorization through the concept of badges. A badge is any normal resource (meaning that it may be fungible or non-fungible) which components use to perform authentication and authorization checks. Components may be set up to require that some badge is present before certain methods may be called; only when the caller presents a valid badge, through the auth zone or by passing it by intent, are they authorized to call such methods. Let’s look at a Scrypto example of how badges may be used for the purpose of authorization. Say that we are developing some blueprint where there is a vault that only admins can withdraw funds from. To implement this pattern, we would make it so that when our component is instantiated, it creates an admin badge, sets up the access rules to limit withdrawals to admins only, then returns the admin badge back to the caller. This would look as follows in Scrypto:

```rust
use scrypto::prelude::*;

#[blueprint]
mod donations_box_module {
    enable_method_auth! {
        methods {
            withdraw => restrict_to: [OWNER]; // #1
        }
    }

    pub struct DonationsBox {
        donations: Vault,
    }

    impl DonationsBox {
        pub fn instantiate_donations_box() -> (Global<DonationsBox>, Bucket) {
            // Create the admin badge
            let admin_badge: Bucket = ResourceBuilder::new_fungible(OwnerRole::None) // #2
                .divisibility(DIVISIBILITY_NONE)
                .metadata(metadata!(
                    init {
                        "name" => "Admin Badge", locked;
                    }
                ))
                .mint_initial_supply(1);

            // Instantiate the component
            let component = Self {
                donations: Vault::new(XRD),
            }
            .instantiate()
            .prepare_to_globalize(
                OwnerRole::Fixed(rule!(require(admin_badge.resource_address()))),
            ) // #3
            .globalize();

            // Return the component address and the admin_badge
            (component, admin_badge)
        }

        pub fn withdraw(&mut self) -> Bucket {
            // This method can only be called if the caller presents an admin badge
            self.donations.take_all()
        }
    }
}
```

- Calling enable_method_auth! to restrict the withdraw method to the component OWNER.
- Creating the admin badge which will be used for authorization checks throughout the component.
- Mapping the AccessRule for the OWNER role to the admin badge.

## Design Patterns

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/scrypto-design-patterns/scrypto-design-patterns
Updated: 2026-02-18
Summary: Legacy documentation: Design Patterns
Build > Scrypto > Design Patterns — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/scrypto-design-patterns/README.md)

## Scrypto Test

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/testing/scrypto-test
Updated: 2026-02-18
Summary: This document provides a description and guidance on writing tests using the scrypto-test library.
Build > Scrypto > Testing > Scrypto Test — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/testing/scrypto-test.md)

This document provides a description and guidance on writing tests using the scrypto-test library.
The library provides two frameworks to help you test your blueprint code:

- TestEnvironment - Allows developers to call their Scrypto functions and methods directly and assert on the output
- SimulatedLedger - Allows developers to publish Scrypto code to a simulated ledger and run various test transactions to check the outcomes

Both TestEnvironment and SimulatedLedger will be useful throughout your blueprint development journey. The former is useful for unit testing a specific function or method, without needing to think about costing or auth. The latter is useful for integration testing your blueprint in a close-to-production environment, where all limits and restrictions are applied.

Import scrypto-test into your project

Before you can use scrypto-test to test your blueprints and packages, you must update your Cargo.toml file to ensure that:

- Your package is using resolver = "2".
- scrypto-test is added as a dev-dependency.

An example of a well-configured Cargo.toml file can be seen below:

```toml
[package]
name = "radiswap"
version = "0.1.0"
edition = "2021"  # Your package is using resolver = "2"; Edition 2021 defaults the resolver to 2

[dependencies]
sbor = { version = "1.2.0" }
scrypto = { version = "1.2.0" }

[dev-dependencies]
scrypto-test = { version = "1.2.0" }

[features]
default = []

[profile.release]
opt-level = 'z'        # Optimize for size.
lto = true             # Enable Link Time Optimization.
codegen-units = 1      # Reduce number of codegen units to increase optimizations.
panic = 'abort'        # Abort on panic.
strip = true           # Strip the symbols.
overflow-checks = true # Panic in the case of an overflow.

[lib]
crate-type = ["cdylib", "lib"]

# Set the package crate as its own empty workspace, to hide it from any potential ancestor workspace.
# Remove this [workspace] section if you intend the package to be part of a Cargo workspace.
[workspace]
```

Test Environment

At the heart of this testing framework is the TestEnvironment (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html) struct, which encapsulates a self-contained instance of the Radix Engine, consisting of the entire engine stack: substate database, Scrypto and native VMs, track, system config, and kernel. Native blueprints interact with and make invocations to the engine through the ClientApi (https://docs.rs/scrypto/latest/scrypto/api/trait.ClientApi.html). The TestEnvironment (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html) implements the ClientApi (https://docs.rs/scrypto/latest/scrypto/api/trait.ClientApi.html) as well, allowing tests to get the current actor, read substates, call methods and functions, etc.

Blueprint Test Bindings

This framework generates test bindings through the #[blueprint] attribute macro that provide a higher-level interface to use in tests, so that you are not writing raw and untyped method and function calls. There are three main parts generated as part of the test bindings:

- A ${BlueprintName}State struct: an automatically generated struct of the state that components of this blueprint have.
- A ${BlueprintName} struct: a wrapper around a NodeId; this is the struct that has the implementation of all of the methods and functions of your blueprint.
- The impl of the ${BlueprintName} struct: the implementation of the ${BlueprintName} struct is also autogenerated.
- For each function that the blueprint has, there exists a function with the same name, arguments, and returns in the implementation of ${BlueprintName}, but with two additional arguments: the address of the package and a mutable reference to the TestEnvironment.
- For each method that the blueprint has, there exists a function with the same name, arguments, and returns in the implementation of ${BlueprintName}, but with one additional argument: a mutable reference to the TestEnvironment.

All of the above can be found in a module called _test. Test bindings make it very easy to write tests without needing to worry about getting string function or method names right, or the order and types of arguments. They add a layer of type-safety that makes such errors easy to catch at compile-time. For cases where test bindings are not available, the TestEnvironment::call_method_typed (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.call_method_typed) and TestEnvironment::call_function_typed (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.call_function_typed) methods can be used.

Enabling and Disabling System Modules at Runtime

The Radix Engine kernel is designed to be modular, with concepts such as auth, costing, limits, and the transaction runtime implemented as system modules that may be enabled or disabled during the test runtime without the need for a new kernel. This modular design proves useful when writing tests. As an example, you may not want to think about costing at all when writing tests, so you may opt to disable the costing module entirely and continue your test without it. This can be done through the TestEnvironment::disable_costing_module (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.disable_costing_module) method.
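Conceptually, toggling a system module for a scope of code amounts to caching the current flag, running the code, and restoring the cached value. A minimal plain-Rust sketch of that save/restore pattern follows; the `ModuleFlags` and `Env` types and their fields are hypothetical stand-ins for illustration, not the real engine types:

```rust
// Hypothetical stand-in for the engine's per-module enabled/disabled flags.
#[derive(Clone)]
struct ModuleFlags {
    auth: bool,
    costing: bool,
}

struct Env {
    modules: ModuleFlags,
}

impl Env {
    /// Run `f` with the auth flag forced off, then restore the previous flags.
    /// This mirrors the behaviour described for block-scoped helpers such as
    /// `with_auth_module_disabled`.
    fn with_auth_disabled<R>(&mut self, f: impl FnOnce(&mut Env) -> R) -> R {
        let cached = self.modules.clone(); // cache the current module state
        self.modules.auth = false;         // disable the module for this scope
        let result = f(self);              // execute the callback
        self.modules = cached;             // restore the cached state
        result
    }
}
```

Used as `env.with_auth_disabled(|e| …)`, the auth flag reads as disabled inside the closure and is restored to its previous value afterwards, regardless of what the closure did.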
The following table describes the state of each system module when the TestEnvironment (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html) is first instantiated.

| System module | Initial state |
| --- | --- |
| Auth Module | Enabled |
| Limits Module | Enabled |
| Transaction Runtime Module | Enabled |
| Costing Module | Disabled |
| Kernel Trace Module | Disabled |
| Execution Trace Module | Disabled |

Each of the system modules has four methods on the TestEnvironment (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html) struct:

- A method to enable the system module (e.g., TestEnvironment::enable_auth_module (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.enable_auth_module)).
- A method to disable the system module (e.g., TestEnvironment::disable_auth_module (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.disable_auth_module)).
- A method to enable the system module for some block of code and then reset the modules (e.g., TestEnvironment::with_auth_module_enabled (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.with_auth_module_enabled)).
- A method to disable the system module for some block of code and then reset the modules (e.g., TestEnvironment::with_auth_module_disabled (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.with_auth_module_disabled)).

The block-scoped methods cache the state of the system modules, enable or disable the module based on the method that was called, execute the callback, and then restore the system modules to the state cached before the execution of the callback. An example of how the block-scoped methods are used can be found here (scrypto-test.md#how-to). The following is a complete list of the methods used to manipulate the system modules.
Auth Module
- Enable: TestEnvironment::enable_auth_module (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.enable_auth_module)
- Disable: TestEnvironment::disable_auth_module (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.disable_auth_module)
- Block-scope Enabled: TestEnvironment::with_auth_module_enabled (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.with_auth_module_enabled)
- Block-scope Disabled: TestEnvironment::with_auth_module_disabled (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.with_auth_module_disabled)

Limits Module
- Enable: TestEnvironment::enable_limits_module (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.enable_limits_module)
- Disable: TestEnvironment::disable_limits_module (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.disable_limits_module)
- Block-scope Enabled: TestEnvironment::with_limits_module_enabled (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.with_limits_module_enabled)
- Block-scope Disabled: TestEnvironment::with_limits_module_disabled (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.with_limits_module_disabled)

Transaction Runtime Module
- Enable: TestEnvironment::enable_transaction_runtime_module (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.enable_transaction_runtime_module)
- Disable: TestEnvironment::disable_transaction_runtime_module (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.disable_transaction_runtime_module)
- Block-scope Enabled: TestEnvironment::with_transaction_runtime_module_enabled (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.with_transaction_runtime_module_enabled)
- Block-scope Disabled: TestEnvironment::with_transaction_runtime_module_disabled (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.with_transaction_runtime_module_disabled)

Costing Module
- Enable: TestEnvironment::enable_costing_module (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.enable_costing_module)
- Disable: TestEnvironment::disable_costing_module (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.disable_costing_module)
- Block-scope Enabled: TestEnvironment::with_costing_module_enabled (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.with_costing_module_enabled)
- Block-scope Disabled: TestEnvironment::with_costing_module_disabled (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.with_costing_module_disabled)

Kernel Trace Module
- Enable: TestEnvironment::enable_kernel_trace_module (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.enable_kernel_trace_module)
- Disable: TestEnvironment::disable_kernel_trace_module (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.disable_kernel_trace_module)
- Block-scope Enabled: TestEnvironment::with_kernel_trace_module_enabled (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.with_kernel_trace_module_enabled)
- Block-scope Disabled: TestEnvironment::with_kernel_trace_module_disabled (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.with_kernel_trace_module_disabled)

Execution Trace Module
- Enable: TestEnvironment::enable_execution_trace_module (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.enable_execution_trace_module)
- Disable: TestEnvironment::disable_execution_trace_module (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.disable_execution_trace_module)
- Block-scope Enabled: TestEnvironment::with_execution_trace_module_enabled (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.with_execution_trace_module_enabled)
- Block-scope Disabled: TestEnvironment::with_execution_trace_module_disabled (https://docs.rs/scrypto-test/latest/scrypto_test/environment/env/struct.TestEnvironment.html#method.with_execution_trace_module_disabled)

Creation of Buckets and Proofs

The BucketFactory (https://docs.rs/scrypto-test/latest/scrypto_test/sdk/struct.BucketFactory.html) and ProofFactory (https://docs.rs/scrypto-test/latest/scrypto_test/sdk/struct.ProofFactory.html) are part of this testing framework; they aim to provide an easy way to create buckets and proofs within tests. The strategy used for their creation is specified through a CreationStrategy (https://docs.rs/scrypto-test/latest/scrypto_test/sdk/enum.CreationStrategy.html). Currently, there are two supported creation strategies:

- CreationStrategy::DisableAuthAndMint (https://docs.rs/scrypto-test/latest/scrypto_test/sdk/enum.CreationStrategy.html#variant.DisableAuthAndMint): This creation strategy disables the auth module, mints the amount required by the developer, and then re-enables the auth module. Since the only thing done is the disabling of the auth module, this strategy respects all of the rules and checks of the resource before the minting takes place (e.g., NFTs match the NFT schema, they’re not created at tombstones, etc.).
- CreationStrategy::Mock (https://docs.rs/scrypto-test/latest/scrypto_test/sdk/enum.CreationStrategy.html#variant.Mock): (Also known as creation out of thin air.) This creation strategy does not go through the normal means of creating buckets and proofs; it creates a node with the substates expected of a bucket or proof and then hands it over to the caller. An advantage of this approach is that it doesn’t increase the total supply of the resource, which may be useful when testing DeFi logic without worrying about increasing the total supply. However, this approach can be fragile in some cases. When mocking Buckets and Proofs the factory does not perform an exhaustive list of checks, which means that you can get into a bad state if it is used incorrectly. Only use mocking if you understand the risks involved.

The following is an example of the BucketFactory (https://docs.rs/scrypto-test/latest/scrypto_test/sdk/struct.BucketFactory.html) being used to create a bucket of XRD out of thin air (through CreationStrategy::Mock (https://docs.rs/scrypto-test/latest/scrypto_test/sdk/enum.CreationStrategy.html#variant.Mock)) and without increasing the total supply:

```rust
use scrypto_test::prelude::*;

#[test]
fn creation_of_mock_fungible_buckets_succeeds() -> Result<(), RuntimeError> {
    // Arrange
    let mut env = TestEnvironment::new();

    // Act
    let bucket = BucketFactory::create_fungible_bucket(XRD, 10.into(), Mock, &mut env)?;

    // Assert
    let amount = bucket.amount(&mut env)?;
    assert_eq!(amount, dec!("10"));
    Ok(())
}
```

How To

This section provides smaller examples and instructions on how to achieve some of the things you may be looking to do.

How to publish a package?

```rust
use scrypto_test::prelude::*;

#[test]
fn simple_package_can_be_published() -> Result<(), RuntimeError> {
    // Arrange
    let mut env = TestEnvironment::new();

    // Act & Assert
    let package_address = PackageFactory::compile_and_publish(this_package!(), &mut env)?;
    Ok(())
}
```

How to create a new resource?
```rust
use scrypto_test::prelude::*;

#[test]
fn simple_resources_can_be_created_successfully() -> Result<(), RuntimeError> {
    // Arrange
    let mut env = TestEnvironment::new();

    // Act & Assert
    let resource_address = ResourceBuilder::new_fungible(OwnerRole::None)
        .withdraw_roles(withdraw_roles! {
            withdrawer => rule!(require(resource_address));
            withdrawer_updater => rule!(deny_all);
        })
        .no_initial_supply(&mut env)?;
    Ok(())
}
```

How to do some operation with the auth module disabled?

```rust
use scrypto_test::prelude::*;

#[test]
fn xrd_can_be_minted_when_auth_module_is_disabled() -> Result<(), RuntimeError> {
    // Arrange
    let mut env = TestEnvironment::new();

    // Act
    let bucket = env.with_auth_module_disabled(|env| {
        /* Auth Module is disabled just before this point */
        ResourceManager(XRD).mint_fungible(100.into(), env)
        /* System modules are reset just after this point. */
    })?;

    // Assert
    let amount = bucket.amount(&mut env)?;
    assert_eq!(amount, dec!("100"));
    Ok(())
}
```

The approach described here also applies to all other system modules, and to doing operations with the system modules enabled or disabled.

How to have common arranges or teardowns?

There are cases where you may have many tests that all share a large portion of some arrange or teardown logic. While this framework does not specifically provide solutions for sharing code across tests, there are many useful Rust patterns that may be employed here; the simplest and most elegant is probably using callback functions. Imagine you are building a DEX, and many of the tests you write require two resources with a very large supply. One way to avoid writing this bit of code in all of your tests is to have a function that creates the test environment, initializes it in the way you expect, calls a callback function provided by you, and then performs the teardown logic.
Here is an example of the above:

```rust
use scrypto_test::prelude::*;

pub fn two_resource_environment<F>(func: F)
where
    F: FnOnce(TestEnvironment, Bucket, Bucket),
{
    let mut env = TestEnvironment::new();
    let bucket1 = ResourceBuilder::new_fungible(OwnerRole::None)
        .mint_initial_supply(dec!("100000000000"), &mut env)
        .unwrap();
    let bucket2 = ResourceBuilder::new_fungible(OwnerRole::None)
        .mint_initial_supply(dec!("100000000000"), &mut env)
        .unwrap();
    func(env, bucket1, bucket2);
    /* Potential teardown happens here */
}

#[test]
fn contribution_provides_expected_amount_of_pool_units() {
    two_resource_environment(|mut env, bucket1, bucket2| { /* Your test goes here */ })
}
```

Simulated Ledger

The SimulatedLedger is an in-memory ledger simulator which you can interact with: as a user, you submit transactions to the ledger and get receipts back.

Example

The following is an example that uses the simulated ledger to test some of the functionality of the Hello blueprint from radixdlt/radixdlt-scrypto (https://github.com/radixdlt/radixdlt-scrypto/tree/develop/examples/hello-world):

```rust
use scrypto_test::prelude::*;

#[test]
fn test_hello() {
    // Setup the environment
    let mut ledger = LedgerSimulatorBuilder::new().build();

    // Create an account
    let (public_key, _private_key, account) = ledger.new_allocated_account();

    // Publish package
    let package_address = ledger.compile_and_publish(this_package!());

    // Test the `instantiate_hello` function.
    let manifest = ManifestBuilder::new()
        .lock_fee_from_faucet()
        .call_function(
            package_address,
            "Hello",
            "instantiate_hello",
            manifest_args!(),
        )
        .build();
    let receipt = ledger.execute_manifest(
        manifest,
        vec![NonFungibleGlobalId::from_public_key(&public_key)],
    );
    println!("{:?}\n", receipt);
    let component = receipt.expect_commit(true).new_component_addresses()[0];

    // Test the `free_token` method.
    let manifest = ManifestBuilder::new()
        .lock_fee_from_faucet()
        .call_method(component, "free_token", manifest_args!())
        .call_method(
            account,
            "deposit_batch",
            manifest_args!(ManifestExpression::EntireWorktop),
        )
        .build();
    let receipt = ledger.execute_manifest(
        manifest,
        vec![NonFungibleGlobalId::from_public_key(&public_key)],
    );
    println!("{:?}\n", receipt);
    receipt.expect_commit_success();
}
```

## Testing

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/testing/testing
Updated: 2026-02-18
Summary: Legacy documentation: Testing
Build > Scrypto > Testing — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/testing/README.md)

## BLS12-381

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/cryptography/bls12-381
Updated: 2026-02-18
Summary: BLS12-381 implemented in Scrypto is compliant with following standards:
Build > Scrypto > Cryptography > BLS12-381 — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/cryptography/bls12-381.md)

BLS12-381 as implemented in Scrypto is compliant with the following standards:

- IETF draft v05: BLS Signatures (https://datatracker.ietf.org/doc/html/draft-irtf-cfrg-bls-signature-05)
- IETF RFC 9380: Hashing to Elliptic Curves (https://datatracker.ietf.org/doc/html/rfc9380)

It uses the ciphersuite BLS_SIG_BLS12381G2_XMD:SHA-256_SSWU_RO_POP_, which translates into:

- Signature G2 variant (size 96 bytes)
- Public key G1 variant (size 48 bytes)
- Signature scheme POP (proof of possession)

The following methods are implemented within the CryptoUtils module:

- bls12381_v1_verify() - Verify a G2 signature of a given message with a given public key
- bls12381_v1_aggregate_verify() - Verify an aggregated G2 signature using a given vector of message and public key tuples
- bls12381_v1_fast_aggregate_verify() - Verify an aggregated G2 signature using a given message with given multiple keys
- bls12381_g2_signature_aggregate() - Aggregate multiple G2 signatures into a single one

See the
examples below:

```rust
use scrypto::prelude::*;

#[blueprint]
mod crypto_example {
    struct CryptoScrypto {}

    impl CryptoScrypto {
        pub fn bls12381_g1_public_key_conversions() {
            // Bls12381G1PublicKey might be converted from:
            // - a hex string
            // - a vector of bytes (48 bytes long)
            // Bls12381G1PublicKey might be converted to a vector of bytes.
            let pub_key_str = "93b1aa7542a5423e21d8e84b4472c31664412cc604a666e9fdf03baf3c758e728c7a11576ebb01110ac39a0df95636e2";
            let pub_key = match Bls12381G1PublicKey::from_str(pub_key_str) {
                Ok(pub_key) => pub_key,
                Err(err) => panic!(
                    "Error getting Bls12381G1PublicKey from str, error: {:?}",
                    err
                ),
            };
            let pub_key_bytes = pub_key.to_vec(); // Convert to a vector of bytes
            let _pub_key = match Bls12381G1PublicKey::try_from(pub_key_bytes.as_slice()) {
                Ok(pub_key) => pub_key,
                Err(err) => panic!(
                    "Error getting Bls12381G1PublicKey from bytes, error: {:?}",
                    err
                ),
            };
        }

        pub fn bls12381_g2_signature_conversions() {
            // Bls12381G2Signature might be converted from:
            // - a hex string
            // - a vector of bytes (96 bytes long)
            // Bls12381G2Signature might be converted to a vector of bytes.
            let signature_str = "82131f69b6699755f830e29d6ed41cbf759591a2ab598aa4e9686113341118d1db900d190436048601791121b5757c341045d4d0c94a95ec31a9ba6205f9b7504de85dadff52874375c58eec6cec28397279de87d5595101e398d31646d345bb";
            let signature = match Bls12381G2Signature::from_str(signature_str) {
                Ok(signature) => signature,
                Err(err) => panic!(
                    "Error getting Bls12381G2Signature from str, error: {:?}",
                    err
                ),
            };
            let signature_bytes = signature.to_vec(); // Convert to a vector of bytes
            let _signature = match Bls12381G2Signature::try_from(signature_bytes.as_slice()) {
                Ok(signature) => signature,
                Err(err) => panic!(
                    "Error getting Bls12381G2Signature from bytes, error: {:?}",
                    err
                ),
            };
        }

        pub fn bls12381_v1_verify(
            message: Vec<u8>,
            pub_key: Bls12381G1PublicKey,
            signature: Bls12381G2Signature,
        ) -> bool {
            CryptoUtils::bls12381_v1_verify(message, pub_key, signature)
        }

        pub fn bls12381_v1_aggregate_verify(
            // pub_keys_msgs is a vector of tuples of public keys and the corresponding messages
            pub_keys_msgs: Vec<(Bls12381G1PublicKey, Vec<u8>)>,
            signature: Bls12381G2Signature,
        ) -> bool {
            CryptoUtils::bls12381_v1_aggregate_verify(pub_keys_msgs, signature)
        }

        pub fn bls12381_v1_fast_aggregate_verify(
            message: Vec<u8>,
            pub_keys: Vec<Bls12381G1PublicKey>,
            signature: Bls12381G2Signature,
        ) -> bool {
            CryptoUtils::bls12381_v1_fast_aggregate_verify(message, pub_keys, signature)
        }

        pub fn bls12381_g2_signature_aggregate(
            signatures: Vec<Bls12381G2Signature>,
        ) -> Bls12381G2Signature {
            CryptoUtils::bls12381_g2_signature_aggregate(signatures)
        }
    }
}
```

## Keccak256

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/cryptography/keccak256
Updated: 2026-02-18
Summary: See the example below:
Build > Scrypto > Cryptography > Keccak256 — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/cryptography/keccak256.md)

The Keccak256 digest is calculated over a given vector of bytes. See the example below:

```rust
use scrypto::prelude::*;

#[blueprint]
mod crypto_example {
    struct CryptoScrypto {}

    impl CryptoScrypto {
        pub fn keccak256_hash(data: Vec<u8>) -> Hash {
            CryptoUtils::keccak256_hash(data)
        }
    }
}
```

## Cryptography

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/cryptography/cryptography
Updated: 2026-02-18
Summary: Scrypto provides a set of precompiled cryptographic primitives, which are executed natively (rather than within a WASM Virtual Machine) to reduce execution cost
Build > Scrypto > Cryptography — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/cryptography/README.md)

Scrypto provides a set of precompiled cryptographic primitives, which are executed natively (rather than within a WASM Virtual Machine) to reduce execution costs. Those primitives are implemented in a dedicated CryptoUtils module.
At the moment, the supported algorithms are:

- Keccak256 (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/cryptography/keccak256.md)
- BLS12-381 (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/cryptography/bls12-381.md)

Please note that CryptoUtils is only available after the Anemone protocol update.

## Using Royalties

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/royalties/using-royalties
Updated: 2026-02-18
Summary: Royalties can be configured on methods and functions by a package publisher, or an instantiator, and are paid by transactions interacting with the given package
Build > Scrypto > Royalties > Using Royalties — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/royalties/using-royalties.md)

Royalties can be configured on methods and functions by a package publisher or an instantiator, and are paid by transactions interacting with the given package or component as part of the transaction fee.

- A package publisher may configure "developer royalties" to be applied to a package blueprint’s functions, or to component methods such that any component instantiated from that blueprint pays these royalties.
- A component owner may additionally configure "owner royalties" on the methods of the component.

Example uses might include:

- A developer adding a package-level royalty on a blueprint’s constructor, to charge the instantiator.
- A developer adding a small package-level royalty for blueprint methods, to charge end-users.
- An oracle owner charging a small (fixed) component royalty for using an oracle component, to help cover the costs of maintaining the oracle.

Royalty Amount

A royalty is always charged in XRD as part of the transaction fee, but a royalty amount may be configured as either XRD or "approximate USD equivalent". When specified as "approximate USD equivalent", a constant multiplier is used to calculate the XRD charged.
This constant is defined for a given protocol version, and updating this multiplier is done at a protocol update. A large shift in the price of XRD will trigger the creation of a protocol update to adjust this value, as per the policy in this blog post (https://www.radixdlt.com/blog/planned-updates-to-radix-network-fees-and-emissions). As such, charging a royalty as an "approximate USD equivalent" will likely be more stable in terms of real-world cost to end-users in cases where the XRD price has changed significantly since the royalty was set. There is also a maximum cap on what can be charged as a royalty per method; this may be updated at future protocol updates.

Component Royalties

In order to charge component royalties, they must be enabled at instantiation time. See the example below:

```rust
.instantiate()
//..
.enable_component_royalties(component_royalties! {
    // The roles section is optional; if missing, all roles default to OWNER
    roles {
        royalty_setter => rule!(allow_all);
        royalty_setter_updater => OWNER;
        royalty_locker => OWNER;
        royalty_locker_updater => rule!(deny_all);
        royalty_claimer => OWNER;
        royalty_claimer_updater => rule!(deny_all);
    },
    init {
        public_method => Xrd(1.into()), updatable;
        public_method_2 => Usd(1.into()), updatable;
        protected_method => Free, locked;
    }
})
.globalize()
```

Component royalties can be claimed, set, and locked, with proofs for the relevant role present, using these manifest instructions:

```
CLAIM_COMPONENT_ROYALTIES Address("${component_address}");

SET_COMPONENT_ROYALTY
    Address("${component_address}")
    "my_method"
    Enum<1u8>(Decimal("1"))  # Xrd amount; OR Enum<2u8>(Decimal("1")) for Usd; OR Enum<0u8>() for Free
;

LOCK_COMPONENT_ROYALTY Address("${component_address}") "my_method";
```

Royalty Updating

A component royalty on a method can be set as either locked or updatable. An updatable royalty can later be locked. Note that having an updatable royalty on a method may discourage people from using your component or package.
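To make the "approximate USD equivalent" mechanics concrete, the conversion described above can be sketched in plain Rust. The multiplier and cap values below are made-up placeholders, not protocol constants, and the `RoyaltyAmount` enum here is an illustrative mirror of the three configuration variants (Xrd, Usd, Free) rather than the engine's actual type:

```rust
// Made-up numbers for illustration only; the real USD/XRD multiplier and the
// per-method royalty cap are protocol constants, set and adjusted at protocol updates.
const USD_TO_XRD_MULTIPLIER: f64 = 16.0; // hypothetical: XRD charged per 1 USD
const MAX_ROYALTY_IN_XRD: f64 = 150.0;   // hypothetical per-method cap

/// Mirror of the three royalty amount variants a method can be configured with.
enum RoyaltyAmount {
    Free,
    Xrd(f64),
    Usd(f64),
}

/// XRD actually charged as part of the transaction fee, after applying the cap.
fn xrd_charged(amount: &RoyaltyAmount) -> f64 {
    let xrd = match amount {
        RoyaltyAmount::Free => 0.0,
        RoyaltyAmount::Xrd(x) => *x,
        // USD-denominated royalties are converted via the protocol multiplier.
        RoyaltyAmount::Usd(u) => *u * USD_TO_XRD_MULTIPLIER,
    };
    xrd.min(MAX_ROYALTY_IN_XRD) // the per-method cap always applies
}
```

With these placeholder numbers, a `Usd(1.0)` royalty charges 16 XRD, and an oversized `Usd(100.0)` royalty is clipped to the 150 XRD cap; the point is that the XRD charged tracks the configured USD value until the cap bites.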
Package Royalties

Package royalties are optionally defined inside the blueprint’s module using the enable_package_royalties! invocation, and are fixed.

```rust
#[blueprint]
mod my_component {
    enable_package_royalties! {
        new => Xrd(2.into());
        another_function => Xrd(2.into());
        public_method => Free;
        protected_method => Free;
    }

    struct MyComponent {
        //...
    }

    impl MyComponent {
        //...
    }
}
```

Claiming Package Royalties

All royalties generated by the package are directed to the package royalty accumulator vault. They can be claimed by the package owner using the following manifest instruction:

```
CLAIM_PACKAGE_ROYALTIES
    Address("${package_address}");
```

Full Example

We recommend giving Package Owners the ability to claim Package Royalties. In the following example, the Package Owner provides proof using create_proof_of_non_fungibles. You can verify the required proof in the Administrative Roles section of the Package on Mainnet (https://dashboard.radixdlt.com/) and Stokenet (/contents/tech/releases/stokenet) (Search for your package > Configuration > Administrative Roles). You can submit a Raw Transaction on the Mainnet Console (https://console.radixdlt.com/transaction-manifest) and Stokenet Console (https://stokenet-console.radixdlt.com/transaction-manifest).
```
CALL_METHOD
    Address("account_rdx12xuhw6v30chdkhcu7qznz9vu926vxefr4h4tdvc0mdckg9rq4afx9t")
    "create_proof_of_non_fungibles"
    Address("resource_rdx1nfxxxxxxxxxxpkgwnrxxxxxxxxx002558553505xxxxxxxxxpkgwnr")
    Array<NonFungibleLocalId>(
        NonFungibleLocalId("[0da90c7658d1ba2299bfb87939519cff4ae59772e8fa7efb16f7c61dfbcf]")
    );

CLAIM_PACKAGE_ROYALTIES
    Address("package_rdx1pk5scajc6xaz9xdlhpunj5vula9wt9mjara8a7ck7lrpm770698ygs");

CALL_METHOD
    Address("account_rdx12xuhw6v30chdkhcu7qznz9vu926vxefr4h4tdvc0mdckg9rq4afx9t")
    "deposit_batch"
    Expression("ENTIRE_WORKTOP");
```

## Royalties

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/royalties/royalties
Updated: 2026-02-18

Summary: Legacy documentation: Royalties

Build > Scrypto > Royalties — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/royalties/README.md)

## Entity Metadata

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/scrypto-entity-metadata
Updated: 2026-02-18

Summary: See the following articles:

Build > Scrypto > Entity Metadata — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/scrypto-entity-metadata.md)

See the following articles:

- Metadata Technical Model (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/metadata/entity-metadata.md)
- Metadata Standards (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/metadata-standards/README.md)

For details about setting metadata from Scrypto, for now, see the Metadata Technical Model.

## Coverage tool

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/coverage-tool
Updated: 2026-02-18

Summary: The Scrypto coverage tool allows you to check the test coverage of your blueprint code by running the tests with an instrumented wasm file.
Build > Scrypto > Coverage tool — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/coverage-tool.md)

The Scrypto coverage tool allows you to check the test coverage of your blueprint code by running the tests with an instrumented wasm file.

Installation

Scrypto coverage is supported since Scrypto v1.2.0. Follow this guide (https://github.com/radixdlt/radixdlt-scrypto?tab=readme-ov-file#installation) for installation.

Compatibility

Currently, the supported environment is Linux amd64 with Rust 1.81.0-nightly. To install the specific Rust version, run `rustup install nightly-2024-07-18-x86_64-unknown-linux-gnu`.

How to run

Inside your blueprint folder, execute the command `scrypto coverage`. It will compile the package, run the tests, and generate an HTML report in the coverage/report folder.

Example report

Additionally, the source of all analyzed files can be viewed. All lines which were not visited during test execution are marked. More information about interpreting report data can be found here (https://clang.llvm.org/docs/SourceBasedCodeCoverage.html#interpreting-reports).

Step by step instructions

Running coverage tool

You can follow these steps to run coverage on an example blueprint:

- Install Radix CLI tools: `cargo install --force radix-clis@1.2.0`
- Create a test package: `scrypto new-package hello`
- Run the coverage tool:

```
cd hello
RUSTUP_TOOLCHAIN=nightly-2024-07-18 scrypto coverage
```

Environment setup

Clean OS using Docker

The following instructions will prepare a clean OS using a Docker image (alternatively, a local instance of Linux can be used):

- Pull the image and get its ID:

```
docker pull ubuntu:mantic
docker images
```

- Start a container (replace <IMAGE_ID> with the value obtained in step 1) and get its ID:

```
docker run -t -d <IMAGE_ID>
docker ps -a
```

- Log into the container (replace <CONTAINER_ID> with the value obtained in step 2):

```
docker exec -i -t <CONTAINER_ID> /bin/bash
```

Dependency installation

- Install the required packages:

```
apt-get update
apt install build-essential llvm cmake clang wget curl git
```

- Install the Rust compiler:

```
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
source $HOME/.cargo/env
rustup target add wasm32-unknown-unknown
```

- Install the supported Rust toolchain (Rust 1.81.0-nightly):

```
rustup install nightly-2024-07-18
rustup target add wasm32-unknown-unknown --toolchain nightly-2024-07-18
```

- Install the supported LLVM version; for Rust 1.81.0-nightly it is version 18:

```
apt install lsb-release wget software-properties-common gnupg
bash -c "$(wget -O - https://apt.llvm.org/llvm.sh)" llvm.sh 18
apt install llvm-18
```

- Now you can go to the Running Coverage Tool (coverage-tool.md#running-coverage-tool) instructions.

## Scrypto Events

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/scrypto-events
Updated: 2026-02-18

Summary: Events in Scrypto are a way to communicate to off chain clients. They are emitted by the component and can be listened for to begin secondary actions with the G

Build > Scrypto > Scrypto Events — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/scrypto-events.md)

Events in Scrypto are a way to communicate with off-chain clients. They are emitted by a component and can be listened for to begin secondary actions with the Gateway or Core APIs. There are many events that already exist in the native blueprints, e.g. the WithdrawEvent on accounts:

```rust
#[derive(ScryptoSbor, ScryptoEvent, Debug, PartialEq, Eq)]
pub enum WithdrawEvent {
    Fungible(ResourceAddress, Decimal),
    NonFungible(ResourceAddress, IndexSet<NonFungibleLocalId>),
}
```

Observing Events

Events can be observed in the transaction stream. To receive events in this stream, the following opt-in must be included in the request body:

```json
"opt_ins": {
    "receipt_events": true
}
```

Gateway API Stream endpoint documentation can be found here (https://radix-babylon-gateway-api.redoc.ly/#operation/StreamTransactions).
Events can also be observed in the events section of transaction receipts, e.g.:

```json
{
    "status": "CommittedSuccess",
    // --snip--
    "events": [
        {
            "name": "LockFeeEvent",
            "emitter": {
                "type": "Method",
                "entity": {
                    "is_global": false,
                    "entity_type": "InternalFungibleVault",
                    "entity_address": "internal_vault_tdx_2_1tpv5433wcz7hyg0tutnp0960km5sda8p0jmdnaqanpjw43ucwhyey6"
                },
                "object_module_id": "Main"
            },
            "data": {
                "kind": "Tuple",
                "type_name": "LockFeeEvent",
                "fields": [
                    {
                        "kind": "Decimal",
                        "field_name": "amount",
                        "value": "0.6118835683725"
                    }
                ]
            }
        },
        {
            "name": "WithdrawEvent",
            "emitter": {
                "type": "Method",
                "entity": {
                    "is_global": false,
                    "entity_type": "InternalFungibleVault",
                    "entity_address": "internal_vault_tdx_2_1tpv5433wcz7hyg0tutnp0960km5sda8p0jmdnaqanpjw43ucwhyey6"
                },
                "object_module_id": "Main"
            },
            "data": {
                "kind": "Tuple",
                "type_name": "WithdrawEvent",
                "fields": [
                    {
                        "kind": "Decimal",
                        "field_name": "amount",
                        "value": "3"
                    }
                ]
            }
        }
        // --snip--
    ]
}
```

These are available via the Gateway API Committed Transaction Details endpoint (https://radix-babylon-gateway-api.redoc.ly/#operation/TransactionCommittedDetails).

Custom Events

To create your own events, you can register and emit them directly from Scrypto.

```rust
use scrypto::prelude::*;

#[derive(ScryptoSbor, ScryptoEvent)]
struct RegisteredEvent {
    number: u64,
}

#[blueprint]
#[events(RegisteredEvent)]
mod example_event {
    struct ExampleEvent;

    impl ExampleEvent {
        pub fn emit_registered_event(number: u64) {
            Runtime::emit_event(RegisteredEvent { number });
        }
    }
}
```

The blueprint macro expects an optional #[events(…​)] attribute for the event registration.
Multiple event types can be provided in the attribute to register more events, i.e.:

```rust
#[blueprint]
#[events(NewUserEvent, UserIsNoMoreEvent, UserIsUserEvent)]
mod blueprint {
    struct Club {}
    impl Club {}
}
```

## Bech32 Address Types Conversion

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/bech32-address-types-conversion
Updated: 2026-02-18

Summary: To convert a Bech32 address to an Address type in Scrypto (e.g., ComponentAddress), you can use the following example for a ComponentAddress on Stokenet:

Build > Scrypto > Bech32 Address Types Conversion — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/bech32-address-types-conversion.md)

Converting Bech32 to Address Type

To convert a Bech32 address to an Address type in Scrypto (e.g., ComponentAddress), you can use the following example for a ComponentAddress on Stokenet:

```rust
let my_bech32_address = "component_tdx_2_1cptxxxxxxxxxfaucetxxxxxxxxx000527798379xxxxxxxxxyulkzl";
let my_component_address = ComponentAddress::try_from_bech32(
    &AddressBech32Decoder::new(&NetworkDefinition::stokenet()),
    &my_bech32_address
).unwrap();
```

This approach is applicable to other address types such as ResourceAddress. Simply replace ComponentAddress with the desired address type. If you are working in a different environment, replace ::stokenet() with the appropriate network, such as ::mainnet() or ::simulator().

Converting Address Type to Bech32

To convert an Address type (e.g., ResourceAddress) to a Bech32-encoded address, you can use the following method:

```rust
let my_resource_address = resource_manager.address();
let my_bech32_address = Runtime::bech32_encode_address(my_resource_address);
```

This method works for any address type. Replace resource_manager.address() with the appropriate method to obtain the address you want to encode. By following these examples, you can easily convert between Bech32-encoded addresses and their respective Address types across different environments.
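As an aside on the address format itself: because the network is encoded in the Bech32m human-readable prefix, you can distinguish addresses without fully decoding them. The following plain-Rust helper is illustrative only (`network_of` is not a Scrypto API), and covers just the mainnet (`_rdx`) and Stokenet (`_tdx_2_`) prefixes.

```rust
// Illustrative helper (not part of Scrypto): tell which network a
// Bech32m-encoded Radix address targets from its human-readable prefix.

fn network_of(address: &str) -> Option<&'static str> {
    // In Bech32/Bech32m, the human-readable part and the data part are
    // separated by the last '1' in the string (the data charset excludes '1').
    let (hrp, _data) = address.rsplit_once('1')?;
    if hrp.ends_with("_rdx") {
        Some("mainnet")
    } else if hrp.ends_with("_tdx_2_") {
        Some("stokenet")
    } else {
        None
    }
}

fn main() {
    let stokenet_addr = "component_tdx_2_1cptxxxxxxxxxfaucetxxxxxxxxx000527798379xxxxxxxxxyulkzl";
    let mainnet_addr = "account_rdx12xuhw6v30chdkhcu7qznz9vu926vxefr4h4tdvc0mdckg9rq4afx9t";
    assert_eq!(network_of(stokenet_addr), Some("stokenet"));
    assert_eq!(network_of(mainnet_addr), Some("mainnet"));
    println!("ok");
}
```

This only inspects the prefix; for real validation (checksum, entity type), decode the address properly as shown in the Scrypto snippets above.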
## Runtime API URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/runtime Updated: 2026-02-18 Summary: Legacy documentation: Runtime API Build > Scrypto > Runtime API — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/runtime.md) https://docs.rs/scrypto/latest/scrypto/runtime/runtime/struct.Runtime.html (https://docs.rs/scrypto/latest/scrypto/runtime/runtime/struct.Runtime.html) https://docs-babylon.radixdlt.com/main/scrypto/system/runtime.html (https://docs-babylon.radixdlt.com/main/scrypto/system/runtime.html) ## Advanced External Calls URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/advanced-external-calls Updated: 2026-02-18 Summary: From scrypto, you may wish to make calls to a method or blueprint function. This is known as an external, or cross-blueprint call. This allows a developer to cr Build > Scrypto > Advanced External Calls — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/advanced-external-calls.md) From scrypto, you may wish to make calls to a method or blueprint function. This is known as an external, or cross-blueprint call. This allows a developer to create complex systems by composing various blueprints and components together. An introduction to these ideas can be found in Use External Blueprints (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-use-external-blueprints.md) and Use External Components (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-use-external-components.md) . This document covers more combinations of use-cases. Calling a specific blueprint or global component of your package First, you need to get some blueprint stub, T. A blueprint stub comes in two parts: - It has an interface - a set of blueprint functions; and component method definitions. - Some also have a blueprint identity - e.g. a package address and blueprint name. 
This is used by the engine to validate that you have the right object. Currently, this blueprint stub type T can come from a few places. These will be covered in further detail in this article:

- Inside your current package, from another #[blueprint] macro.
- From the extern_blueprint! pseudo-macro inside the #[blueprint] definition of one of your components.
- Using the built-in AnyComponent, which has no pre-defined methods nor identity.
- Soon, we’ll support other mechanisms to create these stubs, without hard-coding a blueprint address.

For a blueprint stub type T, you can make external calls by using:

- Blueprint<T> to make function calls to a package with a known address. Specifically:
  - If T captures an identity, you don’t need to create a Blueprint<T>, you can just use it statically: Blueprint::<T>::function_call(..)
- Global<T> to make method calls to global components. It can also be used to update roles, metadata etc. Specifically:
  - You can create a Global<T> from an address, or more efficiently, you can save it to component state, or receive it in place of an address as a method/function argument.
  - If T captures a blueprint identity, receiving a Global<T> as an argument, returning it, saving it to state, or constructing it, will all trigger validations that the address has the given blueprint.
  - You can then use it to make calls, e.g. Global::<T>::from(address).method_call(..).
- Owned<T> to make method calls to owned or internally mounted components.
  - This is created and used in the same way as Global<T>, however it represents unique ownership of the given component, rather than a shared reference to a global component.
  - An owned component does not have access rules, metadata etc, and can only be called directly by its owner.

Each of these can be used to make the corresponding typed function or method calls, but can also be used to make raw / untyped calls if needed.

Using a blueprint from the same package

You might decide to combine multiple blueprints in the same package.
This allows you to easily deploy complex inter-blueprint functionality to the ledger. In this section, you will learn how to call a blueprint from another one living in the same package. Let’s say you have two blueprints: CoffeeMachine and AlarmClock. If you want to be able to instantiate a CoffeeMachine component and call its methods from one of the AlarmClock’s methods/functions, you would create three files:

src/lib.rs

```rust
// Import the blueprints that are part of the package
mod coffee_machine;
mod alarm_clock;
```

This lib.rs file is the starting point of all Scrypto packages. If you have only one blueprint in the package, you could write the logic directly in that file, like we saw previously. In our case, we will write the logic of the two blueprints in separate files. That’s why in lib.rs we are importing the two other files to include in our package (coffee_machine and alarm_clock) with the mod keyword.

src/coffee_machine.rs

```rust
use scrypto::prelude::*;

#[blueprint]
mod coffee_machine {
    struct CoffeeMachine {}

    impl CoffeeMachine {
        pub fn new() -> Owned<CoffeeMachine> {
            Self {}.instantiate()
        }

        pub fn make_coffee(&self) {
            info!("Brewing coffee !");
        }
    }
}
```

- Here we need to return Owned<CoffeeMachine>, which is magic syntax that the #[blueprint] macro allows.
- Also notice that we do not call the globalize() method after instantiation. This is because we want our component to be instantiated as a local or owned component. An owned component will only be accessible by our second blueprint, which we are going to go through in the next section.

This file includes the logic for the CoffeeMachine blueprint. This blueprint offers a function to instantiate a component with an empty state that offers a make_coffee() method, which we will call from the AlarmClock blueprint.
src/alarm_clock.rs

```rust
use scrypto::prelude::*;
use crate::coffee_machine::coffee_machine::*; // #1

#[blueprint]
mod alarm_clock {
    struct AlarmClock {
        // Store the coffee machine component
        coffee_machine: Owned<CoffeeMachine>,
    }

    impl AlarmClock {
        pub fn new() -> Global<AlarmClock> {
            Self {
                coffee_machine: CoffeeMachine::new(), // #2
            }
            .instantiate()
            .prepare_to_globalize(OwnerRole::None)
            .globalize()
        }

        pub fn try_trigger(&mut self) {
            assert!(
                Runtime::current_epoch() % 100 == 0,
                "It's not time to brew yet !"
            );
            self.coffee_machine.make_coffee(); // #3
        }
    }
}
```

- #1 Import the CoffeeMachine blueprint
- #2 Instantiate a CoffeeMachine component from the blueprint
- #3 Call methods on the component

First, this blueprint imports the CoffeeMachine blueprint at the top of the file. Then, it instantiates a new CoffeeMachine component and stores it inside a newly instantiated AlarmClock component. Finally, in the try_trigger method, the CoffeeMachine’s make_coffee method is called.

Using extern_blueprint!

Watch out for Network Dependence

When creating packages containing static addresses through the extern_blueprint! pseudo-macro and global_component! pseudo-macro, the built WASM and Package Definition are network-dependent. You will need to be careful to version your source files (or update the addresses in them) before building for different networks, and ensure that the resultant artifacts (WASM and .rpd files) are kept separate between networks. There is a workaround for extern_blueprint! in the next section, but we have plans in the near future to make this more intuitive.

The current implementation of extern_blueprint! must live inside a #[blueprint] definition. It takes a package address, a blueprint name, and a blueprint interface, and does a few things:

- It generates a stub type T for the external blueprint. This stub includes scrypto methods, and captures the blueprint's identity (package address and blueprint name).
- It registers the external blueprint's package address as a static dependency of your #[blueprint], so that the engine can allow your blueprint to call the external blueprint without having received or loaded the package address at runtime. This package address is validated at package upload time. This is explained in more detail in the blue box below.

If the external blueprint requires any custom structs or enums, you can copy their definitions verbatim from the source code of the blueprint (along with their derives) and reference them inside the macro.

The global_component!{ .. } pseudo-macro can similarly be used to register a static dependency on a fixed component address.

```rust
// ===================================================================
// EXAMPLE 1 - capturing the functions on the GumballMachine blueprint
// ===================================================================
extern_blueprint! {
    "package_sim1p4kwg8fa7ldhwh8exe5w4acjhp9v982svmxp3yqa8ncruad4rv980g",
    GumballMachine {
        // Blueprint Functions
        fn instantiate_gumball_machine(price: Decimal) -> Global<GumballMachine>;

        // Component Methods
        fn get_price(&self) -> Decimal;
        fn buy_gumball(&mut self, payment: Bucket) -> (Bucket, Bucket);
    }
}

// =======================================
// EXAMPLE 2 - Demonstrating a custom type
// =======================================

// Since the DepositResult enum is required by this blueprint, it needs to be defined outside of the
// extern_blueprint! macro, and then used in the function or method signatures.
// You can obtain the definition of the enum from the source code of the external blueprint.
#[derive(ScryptoSbor)]
enum DepositResult { // #1
    Success,
    Failure,
}

extern_blueprint! {
    "package_sim1p4kwg8fa7ldhwh8exe5w4acjhp9v982svmxp3yqa8ncruad4rv980g",
    CustomAccountComponentTarget as MyAccountComponentTarget { // Alias
        fn deposit(&mut self, b: Bucket) -> DepositResult;
        fn deposit_no_return(&mut self, b: Bucket);
        fn read_balance(&self) -> Decimal;
    }
}
```

What is “node visibility”? What is a static dependency of a blueprint?

Registering an address as a static dependency allows the blueprint to make calls to it without receiving an error about a node reference not being visible. This error protects the engine against blueprints/components making calls to objects they have no right knowing about. To prevent this error, the called global address needs to become visible to the call frame. This can be achieved in a few ways:

- The called address can be a static dependency of the current blueprint.
- The called address can be read from state.
- The called address can be passed in as an argument.

Calling a static component address with global_component!

Once the blueprint package or component is imported, we can use the global_component! pseudo-macro to reference the GumballMachine component. Note that global_component! only works inside the #[blueprint] mod. This registers the given component address as a static dependency of the built package, which allows the package to make calls to the component without the component address being stored or passed into the component.

You can also just create a Global<T> at runtime by receiving it as an argument, reading it from state, or calling address.into(). This doesn’t require knowing a static address ahead-of-time.

```rust
pub fn proxy_buy_gumball(&self, mut payment: Bucket) -> (Bucket, Bucket) {
    let gumball_component: Global<GumballMachine> = global_component!(
        GumballMachine,
        "component_sim1crtkvhxwuff6vk7weufhj9qsd8u7ekajz9zllmqd29mlm8mlxrvsru"
    );
    return gumball_component.buy_gumball(payment);
}
```

A Note on Type Checking

The global_component! macro does not check if the component is of the blueprint type.
Be sure to verify the blueprint info of the component on Radix Explorer before using it.

Workaround: Using extern_blueprint! without package checks

It’s quite common to want a typed interface definition in Global<T> - but not to have to write network-dependent code, deal with blueprint identity check errors, or have to drop to manually coded calls with an untyped Global. We will have official support for this soon, but for the time being there is a slightly awkward workaround which can be used to use Global<T> from extern_blueprint! { ... } whilst avoiding the identity checks:

```rust
use scrypto::prelude::*;

#[blueprint]
mod candy_store {
    extern_blueprint! {
        // 1. We use a native address which exists on every network here.
        //    The precise network doesn't matter, so we use the mainnet
        //    address for the package package. This will also add it as a
        //    static dependency, but that won't matter.
        "package_rdx1pkgxxxxxxxxxpackgexxxxxxxxx000726633226xxxxxxxxxpackge",
        GumballMachine {
            // Blueprint Functions
            fn instantiate_global(price: Decimal) -> (Global<GumballMachine>, Bucket);
            fn instantiate_owned(
                price: Decimal,
                component_address: ComponentAddress
            ) -> Owned<GumballMachine>;

            // Component Methods
            fn get_status(&self) -> Status;
            fn buy_gumball(&mut self, payment: Bucket) -> (Bucket, Bucket);
            fn set_price(&mut self, price: Decimal);
            fn withdraw_earnings(&mut self) -> Bucket;
            fn refill_gumball_machine(&mut self);
        }
    }

    pub struct CandyStore {
        // 2. Instead of using a `Global<GumballMachine>`, we just save a
        //    `ComponentAddress` in the state, which means that we can avoid
        //    the blueprint identity assertions when state is saved.
        pub gumball_machine: ComponentAddress,
    }

    impl CandyStore {
        pub fn new() -> Global<CandyStore> {
            todo!()
        }

        pub fn get_status_from_gumball_machine(&self) -> Status {
            // 3. We call the `gumball_machine` method, which constructs a
            //    temporary `Global<GumballMachine>` for us.
            let gumball_machine = self.gumball_machine();
            // 4. We call a method on the generated stub.
            let status = gumball_machine.get_status();
            // Print the output or do something with it.
            info!("{status:#?}");
            status
        }

        // 5. We create a temporary `Global<GumballMachine>` using some unnatural
        //    methods which avoid any blueprint identity assertions.
        fn gumball_machine(&self) -> Global<GumballMachine> {
            Global::<GumballMachine>(GumballMachine {
                handle: ObjectStubHandle::Global(self.gumball_machine.into()),
            })
        }
    }
}
```

Calling a component with an address from state

```rust
struct MyGumballProxy {
    // Both of these store component addresses. The latter is validated by the engine
    // upon saving that it matches the given package/blueprint.
    gumball_machine_component_address: ComponentAddress,
    gumball_machine_component: Global<GumballMachine>,
}

impl MyGumballProxy {
    // ...
    pub fn proxy_buy_gumball(&self, mut payment: Bucket) -> (Bucket, Bucket) {
        let gumball_component: Global<GumballMachine> =
            self.gumball_machine_component_address.into();
        return gumball_component.buy_gumball(payment);
    }

    pub fn proxy_buy_gumball_2(&self, mut payment: Bucket) -> (Bucket, Bucket) {
        return self.gumball_machine_component.buy_gumball(payment);
    }
}
```

Calling a dynamic component address

In the following example, the gumball_component parameter can be a component address. The engine verifies that the passed address belongs to a component whose blueprint matches (in this case, GumballMachine under package_sim1p4kwg8fa7ldhwh8exe5w4acjhp9v982svmxp3yqa8ncruad4rv980g).

```rust
impl MyGumballProxy {
    // This is equivalent to proxy_buy_gumball_2, except the engine also validates that the
    // component's blueprint matches the package address/blueprint name in the
    // extern_blueprint definition.
    pub fn proxy_buy_gumball(
        &self,
        gumball_machine_component: Global<GumballMachine>,
        mut payment: Bucket,
    ) -> (Bucket, Bucket) {
        return gumball_machine_component.buy_gumball(payment);
    }

    // This is equivalent to proxy_buy_gumball, EXCEPT the engine does not validate that the
    // gumball_machine_component matches the package address/blueprint in the
    // extern_blueprint definition.
    pub fn proxy_buy_gumball_2(
        &self,
        gumball_machine_component: ComponentAddress,
        mut payment: Bucket,
    ) -> (Bucket, Bucket) {
        let gumball_machine_component: Global<GumballMachine> =
            gumball_machine_component.into();
        return gumball_machine_component.buy_gumball(payment);
    }
}
```

Calling a blueprint function

This can be done with Blueprint<X>, where X is a blueprint defined via extern_blueprint! or from another definition in the given package.

```rust
struct MyGumballProxy {
    gumball_machine: Global<GumballMachine>,
}

impl MyGumballProxy {
    pub fn instantiate_proxy(price: Decimal) -> Global<MyGumballProxy> {
        // This can call the function on the GumballMachine blueprint.
        // NOTE: The `extern_blueprint!` definition MUST be inside this #[blueprint] mod
        // for the static dependency on the package to be picked up,
        // and to avoid a reference error at runtime.
        let created_gumball_machine =
            Blueprint::<GumballMachine>::instantiate_gumball_machine(price);
        Self {
            gumball_machine: created_gumball_machine,
        }
        .instantiate()
        .prepare_to_globalize(OwnerRole::None)
        .globalize()
    }
}
```

Calling a component with any blueprint

This can be done with Global<AnyComponent>, which doesn’t do any validation on the blueprint. You can create this as:

```rust
let component: Global<AnyComponent> = Global::from(component_address);
```

However, be aware that you can’t currently easily add nice method calls. In the future we may add an ability to define interfaces without the blueprint validation. Please discuss this in the #scrypto channel on Discord if this would be useful for you.

Calling a package

Sometimes you want to read metadata from a package, or interact with a package as a package. For that, you can use the Package type (which behind the scenes is a type alias around Global).
This can be used like this:

```rust
let my_package: Package = Runtime::package_address().into();
let my_package_description: String = my_package.get_metadata("description").unwrap().unwrap();
```

## Component Ownership

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/component-ownership
Updated: 2026-02-18

Summary: Scrypto allows a component to own other components, similar to how a component can own vaults. If a component is owned by another component, its methods may onl

Build > Scrypto > Component Ownership — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/component-ownership.md)

Scrypto allows a component to own other components, similar to how a component can own vaults. If a component is owned by another component, its methods may only be accessed by the blueprint of the parent component. An example of component ownership:

```rust
use scrypto::prelude::*;

#[blueprint]
mod child {
    struct Child {
        name: String,
    }

    impl Child {
        pub fn new_component(name: String) -> ChildComponent {
            Self { name }.instantiate()
        }

        pub fn get_name(&self) -> String {
            self.name.to_string()
        }
    }
}

#[blueprint]
mod parent {
    struct Parent {
        child0: ChildComponent,
        child1: ChildComponent,
    }

    impl Parent {
        fn new_component() -> ParentComponent {
            let child0: ChildComponent = Child::new_component("child0".to_string());
            let child1: ChildComponent = Child::new_component("child1".to_string());

            // Move the two child components into a new parent component
            let parent_component = Self {
                child0,
                child1,
            }
            .instantiate();
            // child0 and child1 are now owned by this parent_component
            parent_component
        }

        pub fn new_globalized() -> Global<Parent> {
            Self::new_component()
                .prepare_to_globalize(OwnerRole::None)
                .globalize()
        }

        pub fn get_child0_name(&self) -> String {
            self.child0.get_name()
        }

        pub fn get_child1_name(&self) -> String {
            self.child1.get_name()
        }
    }
}
```

Note in the above example how the name of the struct gets an automatic suffix of Component when instantiated.
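The ownership rule above mirrors plain Rust ownership. As a rough analogy (ordinary Rust, not Scrypto — the types here are hypothetical stand-ins for the components above), a Parent value owns its Child values, and external code can only reach a child through the parent's public methods:

```rust
// Plain-Rust analogy (not Scrypto) of the owned-component pattern: Parent
// owns two Child values; the only way outside code can read a Child is
// through Parent's public delegating methods.

struct Child {
    name: String,
}

impl Child {
    fn new(name: String) -> Self {
        Child { name }
    }

    fn get_name(&self) -> String {
        self.name.clone()
    }
}

pub struct Parent {
    child0: Child,
    child1: Child,
}

impl Parent {
    pub fn new() -> Self {
        Parent {
            child0: Child::new("child0".to_string()),
            child1: Child::new("child1".to_string()),
        }
    }

    // Delegation: external callers cannot touch child0/child1 directly.
    pub fn get_child0_name(&self) -> String {
        self.child0.get_name()
    }

    pub fn get_child1_name(&self) -> String {
        self.child1.get_name()
    }
}

fn main() {
    let parent = Parent::new();
    assert_eq!(parent.get_child0_name(), "child0");
    assert_eq!(parent.get_child1_name(), "child1");
    println!("ok");
}
```

In Scrypto the engine enforces this at the ledger level: an owned component's methods are callable only via its owning component's blueprint, just as private fields are reachable only via `Parent` here.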
## Logging

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/logging
Updated: 2026-02-18

Summary: Logs are very useful for debugging and security purposes. To emit a log in Scrypto, you will need to use one of the following macros.

Build > Scrypto > Logging — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/logging.md)

Logs are very useful for debugging and security purposes. To emit a log in Scrypto, you will need to use one of the following macros:

- error! for error or critical messages
- warn! for warnings
- info! for informational messages
- debug! for debugging
- trace! for tracing

All macros support both simple and formatted messages:

```rust
info!("This is a simple message");
info!("This is a formatted message: {} + {} = {}", 1, 2, 1 + 2);
```

If a variable is of a type which implements the Debug trait but not the Display trait, you will need to replace {} with {:?}, for instance:

```rust
debug!("I'm debugging {:?}", this_structure);
```

## Functions and Methods

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/functions-and-methods
Updated: 2026-02-18

Summary: Functions and methods form the primary part of the implementation of a blueprint; they define the behavior to accomplish specific tasks. Calls to functions and

Build > Scrypto > Functions and Methods — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/functions-and-methods.md)

Functions and methods form the primary part of the implementation of a blueprint; they define the behavior to accomplish specific tasks. Calls to functions and methods are also how transactions interact with blueprints and components on the Radix network.

Functions

Functions are similar to what you think of as a "static function" in other languages. They do not depend on any internal state, and can be called directly on a blueprint. Typically, a blueprint offers at least one function that performs instantiation of a component.
Functions can have input parameters and return values, they just can’t have a reference to self. Here’s an example of a function signature that instantiates a component: pub fn instantiate_my_component(init_count: u32) -> Global { Methods Methods require a reference to self, and can read and modify internal state. Methods can only be called on components (blueprints don’t have internal state). The first argument of a method is either &self or &mut self: - With &self, the statements within the method can only read the component state - With &mut self, the statements can read and write the component state Some example method signatures: pub fn get_value(&self) -> u32 { pub fn update_value(&mut self, new_value: u32) { Visibility All public functions and methods defined in a blueprint may be invoked by external entities via transactions or calls from other functions or methods. Methods are only invokable on an instantiated component. To check what functions and methods are available on a blueprint deployed to your local simulator, run this command: resim export-abi Private functions and methods You can create methods and functions that are only callable from within the blueprint itself by removing the pub keyword in front of the signature: use scrypto::prelude::*; #[blueprint] mod gumball_machine { struct GumballMachine { gumballs: Vault } impl GumballMachine { pub fn instantiate() -> Global { let bucket = ResourceBuilder::new_fungible() .divisibility(DIVISIBILITY_NONE) .mint_initial_supply(1000); Self { gumballs: Vault::with_bucket(bucket) } .instantiate() .prepare_to_globalize(OwnerRole::None) .globalize() } // This is a public method meaning that it will be callable from outside pub fn free_gumball(&mut self) -> Bucket { self.print_vault_info(); self.gumballs.take(1) } // This is a private method. You cannot call it with `resim call-method`. 
fn print_vault_info(&self) { info!("Amount of gumballs left: {}", self.gumballs.amount()); } } } In this example, print_vault_info() can only be called from within this blueprint. It will not appear if you run resim export-abi [package_address] GumballMachine. ## Data Types URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/data-types Updated: 2026-02-18 Summary: Scrypto is based on Rust, which is a statically typed language. All variables are associated with a type, either explicitly specified or inferred by the compiler Build > Scrypto > Data Types — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/data-types.md) Scrypto is based on Rust, which is a statically typed language. All variables are associated with a type, either explicitly specified or inferred by the compiler. In this section, we describe how different data types are supported. Primitive Types Most primitive types are supported. You're free to use any of the following types:
- i8, i16, i32, i64, i128, isize
- u8, u16, u32, u64, u128, usize
- String
isize and usize are compiled into i32 and u32 respectively. Safe Types Safe types are guaranteed to panic when they overflow, as opposed to primitive types. The following types are supported:
- I8, I16, I32, I64, I128, I256, I384, I512
- U8, U16, U32, U64, U128, U256, U384, U512
- Decimal, PreciseDecimal
Notice that there are types that have up to 512 bits of precision. // builtin types wrap silently (in release builds) instead of panicking in case of overflow let a: i32 = i32::MAX; let b: i32 = i32::MAX + 1; // b wraps, no panic // safe types, by contrast, panic in case of overflow let c: I32 = I32::MAX; let d: I32 = I32::MAX + 1; // d overflows, panic Using safe types ensures that a transaction on RadixDLT will be rejected if any of the used safe types would overflow. This is important to make your code safe. Decimals You might have noticed that there are no f32 or f64 listed in the previous section.
That is because floating point arithmetic is not deterministic and does not work in distributed ledger systems. Another technique we can use instead is fixed point arithmetic. We implemented this with the Decimal and PreciseDecimal types. There are multiple ways to instantiate a Decimal: let a: Decimal = 10.into(); let b: Decimal = dec!(10); let c: Decimal = dec!("10.333"); let d: Decimal = Decimal::from(20); let e: Decimal = Decimal::from("20.123444"); Notice that you have to wrap numbers that have a fractional part in quotes. If you don't, you will not be able to publish the package since it would contain floating-point numbers. This Decimal type represents a 192 bit fixed-scale decimal number that can have up to 18 decimal places. If you need even more precision, we provide the 256 bit PreciseDecimal type which allows up to 36 decimal places. Decimal and PreciseDecimal provide you some useful methods: You can check out the Decimal Scrypto Crate Rust Docs here (https://docs.rs/scrypto/1.2.0/scrypto/math/struct.Decimal.html) to see all of them in detail. Likewise the PreciseDecimal Scrypto Crate Rust Docs here (https://docs.rs/scrypto/1.2.0/scrypto/math/precise_decimal/struct.PreciseDecimal.html) Structs and Enums Rust structs and enums are also supported, as long as the fields are of the supported types. At this stage, no generics are supported for custom structs and enums. To use enums in Scrypto, you have to make them derive ScryptoSbor: #[derive(ScryptoSbor)] pub enum Color { White, Blue, Black, Red, Green, } Container Types In addition to basic types, the following container types are also supported:
- Option<T>: optional types
- [T; N]: array types
- (T, U, P, L, E): tuple types
- Vec<T>: dynamic-length vector type
- BTreeSet<T>, BTreeMap<K, V>: B-Tree set and map
- HashSet<T>, HashMap<K, V>: Hash set and map
Scrypto Types Scrypto also introduces a few domain-specific types to enable asset-oriented programming.
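As intuition for the resource types listed below, here is a toy plain-Rust sketch (not the real Scrypto API — the field and method names are illustrative only) of why buckets and vaults make assets move-only: taking from a vault yields a bucket, and putting a bucket back consumes it, so amounts can never be duplicated:

```rust
// Toy sketch only: real Scrypto Buckets/Vaults are protocol-level
// constructs; these types merely illustrate move-only semantics.
struct Bucket {
    amount: u64,
}

struct Vault {
    amount: u64,
}

impl Vault {
    fn with_bucket(bucket: Bucket) -> Vault {
        // The bucket is moved (consumed) here; it cannot be reused.
        Vault { amount: bucket.amount }
    }

    fn take(&mut self, amount: u64) -> Bucket {
        assert!(amount <= self.amount, "insufficient balance");
        self.amount -= amount;
        Bucket { amount }
    }

    fn put(&mut self, bucket: Bucket) {
        // Ownership of the bucket is taken here too, not copied.
        self.amount += bucket.amount;
    }
}

fn main() {
    let mut vault = Vault::with_bucket(Bucket { amount: 1000 });
    let payment = vault.take(3);
    assert_eq!(vault.amount, 997);
    vault.put(payment);
    assert_eq!(vault.amount, 1000);
    // A second `vault.put(payment)` would not compile: payment was moved.
}
```

Rust's ownership rules do at compile time what the Radix Engine enforces at the protocol level: a resource can be moved, but never silently copied or dropped.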
Types related to blueprints and components

| Type | Description |
| --- | --- |
| PackageAddress | Represents the system-wide address of a Package. |
| ComponentAddress | Represents the system-wide address of a Component. |
| Global<T> | Represents a reference to a global object. |
| Globalizing | Represents a local component to be globalized. |
| Attached<T> | Represents an attached module to a global object. |
| Owned<T> | Represents an owned local component. |
| KeyValueStore<K, V> | Represents a lookup table and the data it contains. It uses key-value pairs to store and retrieve data. |

Types related to cryptography

| Type | Description |
| --- | --- |
| Hash | Represents a 32-byte hash digest. Currently, the only supported hash algorithm is SHA256. |
| Secp256k1PublicKey | Represents an ECDSA public key. Currently, the only supported curve is secp256k1. |
| Secp256k1PrivateKey | Represents an ECDSA private key. Currently, the only supported curve is secp256k1. |

Types related to Math

| Type | Description |
| --- | --- |
| Decimal | Represents a 192 bit fixed-scale decimal number that can have up to 18 decimal places. |
| PreciseDecimal | A 256 bit fixed-scale decimal type that allows up to 36 decimal places, for when even more precision is needed. |

Types related to Resources

| Type | Description |
| --- | --- |
| Bucket | Represents a bucket of resources. Can be of fungible or non-fungible type. Resources in Scrypto can only be moved using Buckets. |
| FungibleBucket | Represents a bucket of fungible resource. This bucket can only contain fungible resource. |
| NonFungibleBucket | Represents a bucket of non-fungible resource. This bucket can only contain non-fungible resource. |
| Proof | Represents a proof of ownership of a resource. Can be a proof of a fungible resource or non-fungible resource. |
| FungibleProof | Represents a proof of a fungible resource. Can only be of a fungible resource. |
| NonFungibleProof | Represents a proof of a non-fungible resource. Can only be of a non-fungible resource. |
| CheckedProof | Represents a proof that has been validated to be legitimate at the application layer. |
| Vault | Represents a vault of resources. Resources in Scrypto can only be stored using Vaults. |
| FungibleVault | Represents a vault which contains a fungible resource. Can only contain fungible resource. |
| NonFungibleVault | Represents a vault which contains a non-fungible resource. Can only contain non-fungible resource. |
| NonFungibleGlobalId | Represents a system-wide address of a Non-Fungible Resource. |
| NonFungibleLocalId | Represents an ID of a Non-Fungible Resource. |
| ResourceAddress | Represents a system-wide address of a Resource. |

## Advanced AccessRules URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/auth/advanced-accessrules Updated: 2026-02-18 Summary: The Authorization section of this documentation repeatedly uses the rule!(…) macro to define an “access rule” - most often, a simple one (i.e. “require the call Build > Scrypto > Authorization > Advanced AccessRules — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/advanced-accessrules.md) The Authorization section of this documentation repeatedly uses the rule!(…) macro to define an “access rule” - most often, a simple one (i.e. “require the caller to hold a certain resource”). However, under the hood, this macro creates an instance of AccessRule, which may potentially be a complex, nested tree-like structure. An example of a complex access rule definition could be: // Example rule to require at least one of: // (A) a super-admin signature // (B) proofs of 3 named approver badges // (C) proofs of 5 moderator badges and an enactment badge.
rule!( require(NonFungibleGlobalId::from_public_key(super_admin_public_key)) || require_n_of( 3, vec![ NonFungibleGlobalId::new(named_approver_resource_address, NonFungibleLocalId::string("Adam")), NonFungibleGlobalId::new(named_approver_resource_address, NonFungibleLocalId::string("Bethany")), NonFungibleGlobalId::new(named_approver_resource_address, NonFungibleLocalId::string("Catherine")), NonFungibleGlobalId::new(named_approver_resource_address, NonFungibleLocalId::string("Daniel")), NonFungibleGlobalId::new(named_approver_resource_address, NonFungibleLocalId::string("Emily")), ], ) || ( require_amount(dec!(5), moderator_badge_resource_address) && require(enactment_badge_resource_address) ) ) AccessRule structure At the top-level, an AccessRule may statically allow/disallow all access, or define some specific requirements regarding the Proofs present in the Authorization Zone: // Always allows access: (the default behavior) rule!(allow_all) // Never allows access: rule!(deny_all) // Allows access if and only if the Authorization Zone contains Proofs // matching certain requirements: rule!(<composite requirement>) Underlying data model // Rust (post-Cuttlefish) enum AccessRule { AllowAll, DenyAll, Protected(CompositeRequirement), } // SBOR (and Rust pre-Cuttlefish) v SBOR Enum Discriminator enum AccessRule { AllowAll, // 0 DenyAll, // 1 Protected(AccessRuleNode), // 2 } Composite Requirement A CompositeRequirement itself is a boolean-like expression tree built from Basic Requirements. It can be:
- A single Basic Requirement (defined below - but roughly require_*() conditions)
- Any Of a list of Composite Requirements
- All Of a list of Composite Requirements
In Scrypto, we can define composite rules using basic logic operators (&& and || along with (..) brackets for grouping), and the composite rule is built for us automatically. Some examples of rules with composite requirements follow: // In the below, A, B and C are any composite or basic requirements.
// Requires both A and B to be satisfied: rule!(A && B) // Requires (at least) one of A or B or C to be satisfied: rule!(A || B || C) // Follows standard boolean operator precedence: (i.e. requires A, or both B and C) rule!(A || B && C) // Supports parentheses: (i.e. requires A or B, and additionally C) rule!((A || B) && C) // Logical Negation is NOT supported: // rule!(A && !B) - does NOT compile Limits To protect resource usage, we enforce the following limits within any AccessRule's expression:
- Maximum depth of a logical expression tree = 8.
  - Intuitively, the "depth" corresponds to the number of nested expressions that use different operators.
  - For example, (a && b) || (c && d || e) is of depth = 3, while a || b || c || d || e is of depth = 1 (since the chained || counts as a single, multi-input "or" expression).
  - This prevents excessive native stack usage.
- Maximum number of nodes in a logical expression tree = 64.
  - Intuitively, each operation and each leaf condition counts as a node.
  - For example, (a && b) || (c && d || e) has 9 nodes, while a || b || c || d || e has 6 nodes (since the chained || counts as a single, multi-input "or" expression).
  - This prevents an excessive number of condition evaluations during AccessRule checks.
Underlying data model CompositeRequirement was historically known as an AccessRuleNode, and for backwards compatibility, is called an AccessRuleNode in SBOR programmatic JSON. // Rust (post-Cuttlefish) enum CompositeRequirement { BasicRequirement(BasicRequirement), AnyOf(Vec<CompositeRequirement>), AllOf(Vec<CompositeRequirement>), } // SBOR (and Rust pre-Cuttlefish) v SBOR Enum Discriminator enum AccessRuleNode { ProofRule(ProofRule), // 0 AnyOf(Vec<AccessRuleNode>), // 1 AllOf(Vec<AccessRuleNode>), // 2 } Basic Requirements BasicRequirements are the leaf nodes of an AccessRule.
Below is a summary of all available BasicRequirements: // Requires a proof of a non-zero amount of the given item to be present in the Authorization Zone: require(<resource_address or non_fungible_global_id>) // Requires a proof of (at least) the specified amount A of the given resource: require_amount(dec!(A), <resource_address>) //----- // Requires proofs of a non-zero amount of at least one of the given items: require_any_of(vec![<item>, <item>, ...]) // Requires proofs of a non-zero amount of all of the given items: require_all_of(vec![<item>, <item>, ...]) // Requires proofs of a non-zero amount of (at least) N of the given items: require_n_of(N, vec![<item>, <item>, ...]) Underlying data model BasicRequirement was historically known as ProofRule, and for backwards compatibility, is called a ProofRule in SBOR programmatic JSON. // Rust (post-Cuttlefish) enum BasicRequirement { Require(ResourceOrNonFungible), AmountOf(Decimal, ResourceAddress), CountOf(u8, Vec<ResourceOrNonFungible>), AllOf(Vec<ResourceOrNonFungible>), AnyOf(Vec<ResourceOrNonFungible>), } // SBOR (and Rust pre-Cuttlefish) v SBOR Enum Discriminator enum ProofRule { Require(ResourceOrNonFungible), // 0 AmountOf(Decimal, ResourceAddress), // 1 CountOf(u8, Vec<ResourceOrNonFungible>), // 2 AllOf(Vec<ResourceOrNonFungible>), // 3 AnyOf(Vec<ResourceOrNonFungible>), // 4 } Resource Address or Non-Fungible Most basic requirements take a resource address (fungible OR non-fungible) or a specific non-fungible global id (../../../reference/standards/non-fungible-standards/non-fungible-display.md#nonfungible-global-id-address-display) .
// Requires any non-zero amount of a certain fungible resource: rule!(require(FUNGIBLE_RESOURCE.resource_address())) // Requires any non-zero amount of a certain non-fungible resource: rule!(require(NON_FUNGIBLE_RESOURCE.resource_address())) // Requires a non-fungible resource instance of a specific ID: rule!( require( NonFungibleGlobalId::new( NON_FUNGIBLE_RESOURCE.resource_address(), NonFungibleLocalId::CONCRETE_ID ) ) ) Notes:
- The require_n_of is a generalization of the common special cases:
  - require_any_of(resources) == require_n_of(1, resources);
  - require_all_of(resources) == require_n_of(resources.len(), resources);
  - require(resource) == require_n_of(1, vec![resource]).
- The require_amount may be applied both to fungible resources (where the dec!(A) minimum value is treated literally), and to non-fungible ones (where, in practice, at least ceil(dec!(A)) non-fungible instances of the given resource are required). In either case, at present, the entire amount must be found in a single proof present in the Authorization Zone - in other words: the rule-checking logic does not sum amounts (or instance counts!) coming from different Proof structures. This may change in future versions of the Radix Engine to allow combining of amounts from separate proofs (by taking a union of the underlying proved resources).
Data model structure // Rust and SBOR v SBOR Enum Discriminator enum ResourceOrNonFungible { NonFungible(NonFungibleGlobalId), // 0 Resource(ResourceAddress), // 1 } Implicit Requirements Access rules can be used to require that a proof of an explicit resource or non-fungible is present in the authorization zone. This is sufficient for implementing many custom authorization schemes, where a dApp developer first defines a specialized “badge” resource, and then references it in access rules (see e.g.
the admin_badge resource within our User Badge Pattern (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/scrypto-design-patterns/user-badge-pattern.md) example). But access rules can also include requirements on implicit proofs under special system-reserved resource addresses, which have special meanings to the Radix Engine and aren’t part of the standard authorization zone. Signature Requirements For each public key which signed a given notarized transaction, the system creates an implicit non-fungible proof of NonFungibleLocalId::bytes(LOWER_29_BYTES_OF_BLAKE256B_HASH_OF_PUBLIC_KEY_BYTES) in the authorization zone of the transaction processor (transaction-processor) call frame. The non-fungible proof is created under one of the following special reserved non-fungible resource addresses (see well-known native addresses (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/well-known-addresses.md) ): - Secp256k1 Signature Resource ( stokenet (/contents/tech/releases/stokenet) , mainnet (https://dashboard.radixdlt.com/resource/resource_rdx1nfxxxxxxxxxxsecpsgxxxxxxxxx004638826440xxxxxxxxxsecpsg) ) - Ed25519 Signature Resource ( stokenet (/contents/tech/releases/stokenet) , mainnet (https://dashboard.radixdlt.com/resource/resource_rdx1nfxxxxxxxxxxed25sgxxxxxxxxx002236757237xxxxxxxxxed25sg) ) Due to the single-global-frame visibility rule for access rules, signature proofs are only visible to global frames started as a direct call from the transaction manifest. This rule is an important one for security - and ensures only calls seen in the manifest by the signer can see the signature. For example, it prevents a malicious component from using a signature to authorize a withdrawal. // It's most performant to store/use the public key hash. 
let public_key_hash = PublicKeyHash::Secp256k1(LOWER_29_BYTES_OF_BLAKE256B_HASH_OF_PUBLIC_KEY_BYTES); let public_key_hash = PublicKeyHash::Ed25519(LOWER_29_BYTES_OF_BLAKE256B_HASH_OF_PUBLIC_KEY_BYTES); // You can also use a public key, but in this case, the hash has to be performed in Scrypto, which // will use more execution cost units let public_key = PublicKey::Secp256k1(...); let public_key = PublicKey::Ed25519(...); // Post-Cuttlefish: rule!(require(signature(<public_key or public_key_hash>))); // Pre-Cuttlefish: rule!(require(NonFungibleGlobalId::from_public_key(<public_key>))) rule!(require(NonFungibleGlobalId::from_public_key_hash(<public_key_hash>))) Caller Requirements For advanced use only As a general rule, explicit requirements are clearer and more flexible - we advise only using caller requirements where they are strictly needed to meet the requirements of your application. The system creates some implicit proofs which can be used to verify the caller of the given method/function. These non-fungible proofs are created with a byte-based local id, containing a hash of a descriptor of the address / global caller descriptor, under the following special reserved resource addresses (see well-known native addresses (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/well-known-addresses.md) ):
- Package of Direct Caller Resource ( stokenet (/contents/tech/releases/stokenet) , mainnet (https://dashboard.radixdlt.com/resource/resource_rdx1nfxxxxxxxxxxpkcllrxxxxxxxxx003652646977xxxxxxxxxpkcllr) )
  - NonFungibleLocalId::bytes(BLAKE256B_HASH_OF_SCRYPTO_ENCODED_PACKAGE_ADDRESS)
- Global Caller Resource ( stokenet (/contents/tech/releases/stokenet) , mainnet (https://dashboard.radixdlt.com/resource/resource_rdx1nfxxxxxxxxxxglcllrxxxxxxxxx002350006550xxxxxxxxxglcllr) )
  - NonFungibleLocalId::bytes(BLAKE256B_HASH_OF_SCRYPTO_ENCODED_GLOBAL_CALLER_ENUM)
// Requires that the immediate caller (i.e.
the actor which made the // latest global or internal call) has exactly the given package: rule!(require(package_of_direct_caller(<package_address>))) // Requires that this code was directly called from the given component: // (more specifically: that the global ancestor of the actor who made the // latest global call is the main module of the given global component) rule!(require(global_caller(<component_address>))) // Requires that the global ancestor of the actor who made the latest // global call is a package function on the given blueprint: rule!(require(global_caller(BlueprintId::new(<package_address>, "<blueprint_name>")))) Note that the global caller / package of direct caller cannot be read from the system - but rather has to be provided by the caller as an argument. This is for two reasons:
- All intent and semantics should be captured by arguments. The auth layer is "opt-in". It should be possible to run a transaction without authorization checks enabled, and get the same result as if run with authorization checks and all checks pass.
- The "caller" pattern in other chains has been seen to be easy to mis-apply, and has resulted in various hacks. On Radix, we believe the resource-based auth model is more flexible and safer from hacks.
The below gives an example pattern on the receiver which can verify the caller. On the caller end, you can get your own address with Runtime::global_address() and pass that in as an argument. fn handle_external_call(&self, claimed_caller_address: ComponentAddress, ...) { // Verify that the claimed_caller_address is correct Runtime::assert_access_rule(rule!(require(global_caller(claimed_caller_address)))); let global_caller_address = claimed_caller_address; // Check the component address against some kind of allow-list if !self.is_component_authorized(global_caller_address) { panic!("Caller is not authorized"); } // ... } System Execution Requirements You are unlikely to need these! These proofs are only present during system transactions, which only touch native blueprints.
During user transactions, these requirements will always fail. For special system transactions, the system adds these proofs to the transaction processor call frame:
- System Execution Resource ( stokenet (/contents/tech/releases/stokenet) , mainnet (https://dashboard.radixdlt.com/resource/resource_rdx1nfxxxxxxxxxxsystxnxxxxxxxxx002683325037xxxxxxxxxsystxn) )
- SystemExecution::Protocol is present during protocol update transactions (although auth is actually entirely disabled during genesis bootstrapping). It is represented by an implicit proof of NonFungibleLocalId::integer(0) under the system execution resource.
- SystemExecution::Validator is present in round/epoch change transactions. It is represented by an implicit proof of NonFungibleLocalId::integer(1) under the system execution resource.
// Post-Cuttlefish rule!(require(system_execution(SystemExecution::Validator))) rule!(require(system_execution(SystemExecution::Protocol))) // Pre-Cuttlefish rule!(require(AuthAddresses::protocol_role())) rule!(require(AuthAddresses::validator_role())) ## Assign Component Royalty Roles URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/auth/assign-component-royalty-roles Updated: 2026-02-18 Summary: Components may optionally include Component Royalties. If so, a set of component royalty roles will be defined for that component: Build > Scrypto > Authorization > Assign Component Royalty Roles — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/assign-component-royalty-roles.md) Assign AccessRules for Component Royalty Roles Components may optionally include Component Royalties. If so, a set of component royalty roles will be defined for that component:

| Role | Authority Description | Methods Accessible |
| --- | --- | --- |
| royalty_setter | Update the royalty amount for a method | ComponentRoyalty::set_royalty |
| royalty_setter_updater | Update the AccessRule of the royalty_setter | RoleAssignment::set(ModuleId::Royalty, "royalty_setter", ..) |
| royalty_locker | Lock the royalty amount for a method such that it is no longer updateable | ComponentRoyalty::lock_royalty |
| royalty_locker_updater | Update the AccessRule of the royalty_locker | RoleAssignment::set(ModuleId::Royalty, "royalty_locker", ..) |
| royalty_claimer | Withdraw the royalties accumulated by the component | ComponentRoyalty::claim_royalties |
| royalty_claimer_updater | Update the AccessRule of the royalty_claimer | RoleAssignment::set(ModuleId::Royalty, "royalty_claimer", ..) |

By default, the Owner Role will inherit all Component Royalty Roles. Assign Custom AccessRules for Component Royalty Roles If custom access rules for each component royalty role are required (rather than the Owner role inheriting all roles) then add roles to the component_royalties! macro during component globalization: #[blueprint] mod my_token_sale { enable_method_auth! { roles { super_admin_role => updatable_by: []; admin_role => updatable_by: [super_admin_role]; }, methods { .. } } struct MyTokenSale { .. } impl MyTokenSale { pub fn create() { let owner_badge: Bucket = { .. }; let owner_access_rule: AccessRule = { .. }; let royalty_setter_access_rule: AccessRule = { .. }; let royalty_locker_access_rule: AccessRule = { .. }; let royalty_locker_updater_access_rule: AccessRule = { .. }; MyTokenSale { .. } .instantiate() .prepare_to_globalize(OwnerRole::Fixed(owner_access_rule)) .enable_component_royalties(component_royalties! { roles { royalty_setter => royalty_setter_access_rule; // #1 royalty_setter_updater => rule!(deny_all); // #2 royalty_locker => royalty_locker_access_rule; royalty_locker_updater => royalty_locker_updater_access_rule; royalty_claimer => OWNER; // #3 royalty_claimer_updater => OWNER; }, init { .. } }) .globalize() .. } ..
} }
- The royalty_setter_access_rule is assigned to the royalty_setter role
- The royalty_setter_updater role is not accessible by anyone, effectively “locking” in the AccessRule of royalty_setter
- Using OWNER specifies that the owner role will inherit the royalty_claimer role
## Assign Metadata Roles URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/auth/assign-metadata-roles Updated: 2026-02-18 Summary: Every component has Metadata and a set of Metadata Roles: Build > Scrypto > Authorization > Assign Metadata Roles — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/assign-metadata-roles.md) Assign AccessRules for Metadata Roles Every component has Metadata and a set of Metadata Roles:

| Role | Authority Description | Methods Accessible |
| --- | --- | --- |
| metadata_setter | Update a metadata entry | Metadata::set(..) Metadata::remove(..) |
| metadata_setter_updater | Update the AccessRule of the metadata_setter | RoleAssignment::set(ModuleId::Metadata, "metadata_setter", ..) |
| metadata_locker | Lock metadata entries such that they are no longer updateable | Metadata::lock(..) |
| metadata_locker_updater | Update the AccessRule of the metadata_locker | RoleAssignment::set(ModuleId::Metadata, "metadata_locker", ..) |

By default, the Owner Role will inherit all Metadata Roles. Assign Custom AccessRules for Metadata Roles If custom access rules for each metadata role are required (rather than the Owner role inheriting all roles) then add roles to the metadata! macro during component globalization: #[blueprint] mod my_token_sale { enable_method_auth! { roles { super_admin_role => updatable_by: []; admin_role => updatable_by: [super_admin_role]; }, methods { .. } } struct MyTokenSale { .. } impl MyTokenSale { pub fn create() { let owner_badge: Bucket = { .. }; let owner_access_rule: AccessRule = { .. }; let metadata_setter_access_rule: AccessRule = { .. }; MyTokenSale { .. } .instantiate() .prepare_to_globalize(OwnerRole::Fixed(owner_access_rule)) .metadata(metadata!
{ roles { metadata_setter => metadata_setter_access_rule.clone(); // #1 metadata_setter_updater => metadata_setter_access_rule; metadata_locker => OWNER; // #2 metadata_locker_updater => rule!(deny_all); // #3 } }) .globalize() .. } .. } }
- The metadata_setter_access_rule is assigned to the metadata_setter role
- Using OWNER specifies that the owner role will inherit the metadata_locker role
- The metadata_locker_updater role is not accessible by anyone, effectively “locking” in the AccessRule of metadata_locker
## Assign Roles To Resources URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/auth/assign-roles-to-resources Updated: 2026-02-18 Summary: There is a fixed selection of roles for resources. These roles belong to each of their behaviours. You can find out more about these behaviours, their associat Build > Scrypto > Authorization > Assign Roles To Resources — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/assign-roles-to-resources.md) There is a fixed selection of roles for resources. These roles each belong to one of the resource's behaviours. You can find out more about these behaviours, their associated roles and how to assign them in Resource Behaviors (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/resources/resource-behaviors.md) ## Assign Component Roles URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/auth/assign-component-roles Updated: 2026-02-18 Summary: The conditions for proving whether a caller has a role are defined by an AccessRule. During component instantiation each role defined for the blueprint will be assign Build > Scrypto > Authorization > Assign Component Roles — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/assign-component-roles.md) Assign Component Roles The conditions for proving whether a caller has a role are defined by an AccessRule.
During component instantiation, each role defined for the blueprint will be assigned an AccessRule. Each AccessRule defines a set of resources, called badges, which the caller must prove they have access to. Create a Badge Resource A Badge is simply a resource (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/resources/README.md) used for authorization in an AccessRule. It is no different from any other resource and, in fact, existing resources may be used as badges. To create a badge resource from scratch use the resource builder pattern (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/resources/resource-creation-in-detail.md) : #[blueprint] mod my_token_sale { enable_method_auth! { .. } struct MyTokenSale { .. } impl MyTokenSale { pub fn create() { let owner_badge: Bucket = ResourceBuilder::new_fungible(OwnerRole::None) .divisibility(DIVISIBILITY_NONE) .mint_initial_supply(1); .. } .. } } Create an Access Rule An AccessRule defines the set of resources which a caller must prove they have access to in order to pass auth. To create an AccessRule use the rule! macro: #[blueprint] mod my_token_sale { enable_method_auth! { .. } struct MyTokenSale { .. } impl MyTokenSale { pub fn create() { let owner_badge: Bucket = { .. }; let access_rule: AccessRule = rule!(require(owner_badge.resource_address())); // #1 .. } .. } }
- Using rule! along with require sets the rule that a caller must show proof of any non-zero amount of the owner badge resource
It is possible to create advanced access rules which define more complex sets of resources. This is documented here (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/advanced-accessrules.md) . Assign an AccessRule to the Owner Role The owner role is a special role which every component must define on instantiation. It is special as it inherits any role which isn't assigned an AccessRule.
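The badge-based AccessRule idea above — a boolean requirement tree over badge resources the caller can prove possession of — can be sketched in plain Rust. This is a toy model with made-up badge names, not the real Scrypto AccessRule data model:

```rust
// Toy model of an access rule as a requirement tree over badge
// resource names; the real AccessRule/CompositeRequirement types
// in the Radix Engine are richer than this sketch.
enum Rule {
    Require(&'static str),  // need a proof of this badge resource
    AnyOf(Vec<Rule>),       // at least one sub-rule must be satisfied
    AllOf(Vec<Rule>),       // all sub-rules must be satisfied
}

// Evaluate a rule against the set of badge proofs the caller presents.
fn check(rule: &Rule, proofs: &[&str]) -> bool {
    match rule {
        Rule::Require(badge) => proofs.contains(badge),
        Rule::AnyOf(rules) => rules.iter().any(|r| check(r, proofs)),
        Rule::AllOf(rules) => rules.iter().all(|r| check(r, proofs)),
    }
}

fn main() {
    // "owner badge, OR both admin badges" — badge names are made up.
    let rule = Rule::AnyOf(vec![
        Rule::Require("owner_badge"),
        Rule::AllOf(vec![
            Rule::Require("admin_badge"),
            Rule::Require("super_admin_badge"),
        ]),
    ]);
    assert!(check(&rule, &["owner_badge"]));
    assert!(check(&rule, &["admin_badge", "super_admin_badge"]));
    assert!(!check(&rule, &["admin_badge"]));
}
```

This mirrors how `rule!(require(A) || (require(B) && require(C)))` composes basic requirements with AnyOf/AllOf nodes, as described in the Advanced AccessRules section.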
The Owner Role has three variations:

| Variation | Description |
| --- | --- |
| None | No one may claim they are the owner of the component, and this is immutable. |
| Fixed | There is an owner of the component, provable by the assigned AccessRule, and this AccessRule is immutable. |
| Updatable | There is an owner of the component, provable by the assigned AccessRule, and this AccessRule is updatable by the owner. |

To set an AccessRule for the Owner Role, specify the owner role with an access rule in the prepare_to_globalize method during component globalization: #[blueprint] mod my_token_sale { enable_method_auth! { roles { super_admin_role => updatable_by: []; admin_role => updatable_by: [super_admin_role]; }, methods { .. } } struct MyTokenSale { .. } impl MyTokenSale { pub fn create() { let owner_badge: Bucket = { .. }; let access_rule: AccessRule = { .. }; MyTokenSale { .. } .instantiate() .prepare_to_globalize(OwnerRole::Fixed(access_rule)) // #1 .globalize() .. } .. } }
- A fixed owner role is assigned. Since no access rules have been defined for any custom roles, the owner role automatically inherits the super_admin_role and admin_role roles.
Assign Custom AccessRules for Custom Roles If custom access rules for each role are required (rather than the Owner role inheriting all roles) then use the roles! macro during component globalization: #[blueprint] mod my_token_sale { enable_method_auth! { roles { super_admin_role => updatable_by: []; admin_role => updatable_by: [super_admin_role]; }, methods { .. } } struct MyTokenSale { .. } impl MyTokenSale { pub fn create() { let owner_badge: Bucket = { .. }; let owner_access_rule: AccessRule = { .. }; let admin_access_rule: AccessRule = { .. }; MyTokenSale { .. } .instantiate() .prepare_to_globalize(OwnerRole::Fixed(owner_access_rule)) .roles(roles! { admin => admin_access_rule; // #1 super_admin => OWNER; // #2 }) .globalize() .. } ..
} }
- The admin_access_rule is assigned to the admin role
- Using OWNER specifies that the owner role will inherit the super_admin role
## Assign Roles To Methods URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/auth/structure-roles-and-methods Updated: 2026-02-18 Summary: Methods, unlike functions, use roles to further structure how AccessRules are assigned. Build > Scrypto > Authorization > Assign Roles To Methods — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/structure-roles-and-methods.md) Methods, unlike functions, use roles to further structure how AccessRules are assigned. Roles and the methods each role may access are defined statically for each blueprint. This means that every component of a given blueprint will have the same role/method structure as every other component of the same blueprint. Use the Owner Role and Self Role Every component will always have an Owner Role and a Self Role. The Owner Role is a role which must be defined for every component instantiation. The Self Role is a special role which refers to the component itself. To only allow a subset of methods to be accessed by the owner or self role use the enable_method_auth! macro at the top of your blueprint code: #[blueprint] mod my_token_sale { enable_method_auth! { methods { // #1 buy => PUBLIC; // #2 create_admin => restrict_to: [OWNER, SELF]; // #3 change_price => restrict_to: [OWNER]; redeem_profits => restrict_to: [OWNER]; } } struct MyTokenSale { .. } impl MyTokenSale { pub fn buy(&mut self) { .. } pub fn create_admin(&mut self) { .. } pub fn change_price(&mut self) { .. } pub fn redeem_profits(&mut self) { .. } .. } }
- Each pub method in impl MyTokenSale { .. } must be assigned which roles may access it. Non-pub methods are never accessible.
- Use PUBLIC to allow anyone to call a method. In this case, buy may be called by anyone.
- Use restrict_to to specify which roles may call a method.
In this case, create_admin is restricted to callers who can prove they have the OWNER role, or the component itself (SELF).

Use Custom Roles

Each blueprint may also define its own custom roles for greater control of authorization. To do so, add roles to the enable_method_auth! macro:

#[blueprint]
mod my_token_sale {
    enable_method_auth! {
        roles {
            super_admin_role => updatable_by: []; // #1
            admin_role => updatable_by: [super_admin_role];
        },
        methods {
            buy => PUBLIC;
            create_admin => restrict_to: [super_admin_role]; // #2
            change_price => restrict_to: [admin_role, super_admin_role];
            redeem_profits => restrict_to: [OWNER];
        }
    }

    struct MyTokenSale { .. }

    impl MyTokenSale {
        pub fn buy(&mut self) { .. }
        pub fn create_admin(&mut self) { .. }
        pub fn change_price(&mut self) { .. }
        pub fn redeem_profits(&mut self) { .. }
        ..
    }
}

- Each custom role is associated with a set of roles which specify which roles may update the conditions of that role. In this case, super_admin_role has an empty set, effectively making its AccessRule immutable once defined. The admin_role, on the other hand, may be updated by the super_admin_role.
- Custom roles may then be used when defining which roles a method may be accessed by.

Ignore Auth and Create a Fully Public Blueprint

It may be that auth is unnecessary for your blueprint and that every method should be accessible by everyone. In this case, NOT including the enable_method_auth! macro makes every method public.

#[blueprint]
mod my_faucet {
    struct MyFaucet { .. }

    impl MyFaucet {
        pub fn take(&mut self) { .. } // #1
        ..
    }
}

- Since enable_method_auth! is not used, the take method is callable by anyone

## Assign Function AccessRules

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/auth/assign-function-accessrules
Updated: 2026-02-18
Summary: To specify the access rules for each function use the enable_function_auth! macro at the top of your blueprint code.
Build > Scrypto > Authorization > Assign Function AccessRules — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/assign-function-accessrules.md)

To specify the access rules for each function use the enable_function_auth! macro at the top of your blueprint code.

NOTE: If the enable_function_auth! macro is not used, all functions will default to the rule!(allow_all) AccessRule.

#[blueprint]
mod my_token_sale {
    enable_function_auth! { // #1
        create_component => rule!(allow_all); // #2
        create_special_component => rule!(require(XRD)); // #3
    }

    struct MyTokenSale { .. }

    impl MyTokenSale {
        pub fn create_component() -> Global<MyTokenSale> { .. }
        pub fn create_special_component() -> Global<MyTokenSale> { .. }
        ..
    }
}

- Each pub function in impl MyTokenSale { .. } must be assigned an AccessRule if the enable_function_auth! macro is used. Non-pub functions are never accessible.
- rule!(allow_all) specifies that anyone may call the create_component function
- rule!(require(XRD)) specifies that only a caller who has a proof of XRD in their AuthZone may call the create_special_component function

## Using Proofs

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/auth/using-proofs
Updated: 2026-02-18
Summary: Proofs give us a way to tell if a resource exists in a Vault or Bucket without having to remove it. A Proof can be created from a resource and used instead, whi

Build > Scrypto > Authorization > Using Proofs — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/using-proofs.md)

Proofs give us a way to tell if a resource exists in a Vault or Bucket without having to remove it. A Proof can be created from a resource and used instead, which allows us to know if an actor possesses a resource without the resource being sent. Proofs can be useful for a variety of reasons, but first and foremost is authorization.
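A toy model of this idea in plain Rust may help (illustrative only — the names and types below are ours, not the Scrypto API): a Proof is created from a vault without removing the resource, the backing amount stays locked in the vault for the Proof's lifetime, and a zero-quantity Proof can never be created.

```rust
// Toy vault (not the Scrypto API): creating a Proof locks the backing
// amount in place rather than removing it. This is a simplified additive
// lock; the real engine's locking semantics are more nuanced.
struct Vault {
    amount: u64,
    locked: u64,
}

struct Proof {
    amount: u64,
}

impl Vault {
    fn create_proof_of_amount(&mut self, amount: u64) -> Option<Proof> {
        if amount == 0 || amount > self.amount - self.locked {
            return None; // zero quantity or over-commitment: no Proof
        }
        self.locked += amount;
        Some(Proof { amount })
    }

    fn withdraw(&mut self, amount: u64) -> bool {
        if amount <= self.amount - self.locked {
            self.amount -= amount;
            true
        } else {
            false // would touch tokens locked behind a live Proof
        }
    }

    fn drop_proof(&mut self, proof: Proof) {
        self.locked -= proof.amount; // unlock when the Proof is dropped
    }
}

fn main() {
    let mut vault = Vault { amount: 10, locked: 0 };
    let proof = vault.create_proof_of_amount(4).unwrap();

    assert!(!vault.withdraw(8)); // 4 of the 10 are locked behind the Proof
    assert!(vault.withdraw(6)); // the unlocked remainder is fine

    vault.drop_proof(proof);
    assert!(vault.withdraw(4)); // unlocked again

    let mut empty = Vault { amount: 0, locked: 0 };
    assert!(empty.create_proof_of_amount(1).is_none()); // empty vault: no Proof
    assert!(vault.create_proof_of_amount(0).is_none()); // zero quantity: no Proof
}
```

The key observation is that possession is demonstrated without any transfer: the vault's balance is unchanged the whole time.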
The most common version of this is creating a Proof of a badge to Call a Protected Method/Function (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/call-a-protected-method-function.md) on a component, e.g. an owner badge that's required for the withdrawal of collected XRD from a Gumball Machine component (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-give-the-gumball-machine-an-owner.md) . Proofs are Transient Proofs can only exist for the duration of a transaction. Proof Lifecycle Proofs can be created, transferred, and dropped. Proofs can be passed around as tangible objects like Buckets, or put on/taken off an AuthZone. Implicit proofs (advanced-accessrules.md#implicit-requirements) only live on the AuthZone and can't be created as tangible Proofs like others. Only Proofs in an AuthZone can be used to meet authorization checks in called methods/functions. If checks fail, the transaction will abort, otherwise it proceeds as normal. Just as with Buckets and Vaults (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/resources/buckets-and-vaults.md) there are multiple Scrypto types to refer to tangible Proofs: - Proof - A general Proof type for fungible or non-fungible resources - FungibleProof - A Proof of fungible resources - NonFungibleProof - A Proof of non-fungible resources How does an authorization check work? Authorization checks happen when a protected method (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/structure-roles-and-methods.md) or function (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/assign-function-accessrules.md) is called, or when Runtime::assert_access_rule(rule!(...)) is called directly on an access rule (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/advanced-accessrules.md) . 
This check works by comparing the requirements in the access rule against Proofs in the parent LocalAuthZone. Technically speaking, it's not just the parent AuthZone which is used - the contents of the LocalAuthZone of all the ancestor local callframes in the current and parent global callframe are used.

Creating Proofs

You can create Proofs from Vaults or Buckets.

Proofs lock resources in Vaults and Buckets

Resources are locked in their Vault or Bucket while a Proof of them exists to guarantee the presenter of the Proof has possession of the original resource.

Creating Proofs in Scrypto

You can create Proofs from Vaults or Buckets with several different Scrypto methods.

// Make an IndexSet of one NonFungibleLocalId for our NonFungibleProofs
let non_fungible_local_ids = indexset![NonFungibleLocalId::string("example_id").unwrap()];

// Proofs from Vaults
let proof_1: FungibleProof = self.fungible_vault.create_proof_of_amount(1);
let proof_2: NonFungibleProof = self
    .non_fungible_vault
    .create_proof_of_non_fungibles(&non_fungible_local_ids);

// Proofs from Buckets
let proof_3: Proof = bucket.create_proof_of_all();
let proof_4: FungibleProof = fungible_bucket.create_proof_of_all();
let proof_5: FungibleProof = fungible_bucket.create_proof_of_amount(1);
let proof_6: NonFungibleProof = non_fungible_bucket.create_proof_of_all();
let proof_7: NonFungibleProof = non_fungible_bucket.create_proof_of_non_fungibles(&non_fungible_local_ids);

You can also make Proofs from Proofs:

let proof_2 = proof_1.clone();

Creating Proofs in the manifest

Proofs can be sourced from component calls. Commonly you create Proofs from your Account (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/account.md) .
There are two main Account methods for this, depending on the resource type:

- Fungible: create_proof_of_amount
- Non-fungible: create_proof_of_non_fungibles

These can be used as follows:

CALL_METHOD
    Address("${ACCOUNT_ADDRESS}")
    "create_proof_of_amount"
    Address("${FUNGIBLE_OWNER_BADGE_ADDRESS}")
    Decimal("1");

CALL_METHOD
    Address("${ACCOUNT_ADDRESS}")
    "create_proof_of_non_fungibles"
    Address("${NON_FUNGIBLE_OWNER_BADGE_ADDRESS}")
    Array<NonFungibleLocalId>(NonFungibleLocalId("${BADGE_LOCAL_ID}"));

When you receive Proofs from a manifest call, they are automatically placed on the Auth Zone. There are also lots of manifest instructions for creating, moving and dropping Proofs, such as CREATE_PROOF_FROM_BUCKET_OF_ALL and CREATE_PROOF_FROM_BUCKET_OF_NON_FUNGIBLES. These and other such instructions are listed with examples in Manifest Instructions (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/manifest/manifest-instructions.md) .

Transferring Proofs

Transferring to/from the AuthZone

Tangible Proofs can be put on and popped off the AuthZone:

LocalAuthZone::push(proof);
let proof = LocalAuthZone::pop().unwrap();

In the Transaction manifest, Proofs returned from methods go straight onto the AuthZone. If necessary, Proofs can be taken off with POP_FROM_AUTH_ZONE and put back with PUSH_TO_AUTH_ZONE:

PUSH_TO_AUTH_ZONE Proof("proof");
POP_FROM_AUTH_ZONE Proof("popped_proof"); # Same proof, but it needs a new name

Returning a Proof from a call

Tangible Proofs can be returned freely from a method to a caller. Be careful - this allows the caller to use the Proof for authorization, so only return Proofs to callers you trust.

Passing a Proof to a call

Tangible Proofs can be passed to a method call, to facilitate "proof by intent" discussed later in this article. However, to protect users, Proofs passed by intent to a non-internal call become restricted. Restricted Proofs cannot be put on the AuthZone or be passed by intent to another non-internal call.
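Restriction can be pictured as a one-way flag on the Proof. The plain-Rust sketch below models the two rules just stated (the names `pass_by_intent` and `push_to_auth_zone` are ours, not the Scrypto API): a Proof passed by intent to a non-internal call comes back restricted, and a restricted Proof is rejected both by the AuthZone and by any further non-internal pass.

```rust
// Plain-Rust model of Proof restriction (illustrative only).
#[derive(Clone, Debug, PartialEq)]
struct Proof {
    resource: String,
    restricted: bool,
}

// Passing a Proof by intent to a non-internal call restricts it.
fn pass_by_intent(mut proof: Proof, internal_call: bool) -> Result<Proof, &'static str> {
    if proof.restricted && !internal_call {
        return Err("restricted Proofs cannot be passed by intent to another non-internal call");
    }
    if !internal_call {
        proof.restricted = true;
    }
    Ok(proof)
}

// Restricted Proofs can no longer be put on the AuthZone.
fn push_to_auth_zone(auth_zone: &mut Vec<Proof>, proof: Proof) -> Result<(), &'static str> {
    if proof.restricted {
        return Err("restricted Proofs cannot be put on the AuthZone");
    }
    auth_zone.push(proof);
    Ok(())
}

fn main() {
    let proof = Proof { resource: "badge".to_string(), restricted: false };

    // The first non-internal pass succeeds but restricts the Proof...
    let proof = pass_by_intent(proof, false).unwrap();
    assert!(proof.restricted);

    // ...so it can neither go on the AuthZone nor be passed on again,
    // though internal calls are still fine.
    let mut auth_zone = Vec::new();
    assert!(push_to_auth_zone(&mut auth_zone, proof.clone()).is_err());
    assert!(pass_by_intent(proof.clone(), true).is_ok());
    assert!(pass_by_intent(proof, false).is_err());
}
```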
Why do Proofs get restricted? This allows you to safely pass a Proof by intent without fear it will get misused: Conceptually, you can show your badge to someone, but they can't use the badge themselves. This means we can give users much better guarantees about what they are authorizing from a manifest. We are aware this restriction mechanism causes some friction. We are planning an expansion to the authorization system ( "allowances" (https://discord.com/channels/417762285172555786/1186424910365605898) ) which will transiently permit actions non-locally. Authorizing callers of your method When authorizing a caller, you'll use one of two strategies: - Verify the role of your caller - Typically, you want to check that your caller has a particular role, for example: "check that your caller is an admin" or "check that your caller is a user". - Verify precisely who your caller is - Occasionally, you will want to verify precisely who your caller is, for example "check the caller is a particular user" or "check which component has called your component". To verify precisely who your caller is, you must require they pass details of who they are and then validate those details are accurate. There are two common patterns for this covered below. If possible, we recommend verifying the role of your caller because this is easier and more flexible. Why do we require a caller to tell us who they are? For precise verifications, the caller must pass in details of who they are in their arguments. This is because we have designed the authorization layer to not affect how a transaction executes. Authorization can only cause the transaction to fail when enabled. This has a few benefits: - Conceptual simplicity: It's easier to reason about how a method behaves if only arguments and component state can affect its execution. - Practical use-cases: It allows authorization to be disabled in the engine for use cases such as preview or testing. 
- Reproducibility: We want the result of preview/execution to not depend on exactly who signed a transaction. Signatures should just enable a transaction to succeed, not affect its execution.

Verify the role of your caller with the AuthZone

The standard method to protect your methods is by assigning roles to methods (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/structure-roles-and-methods.md) using Scrypto's role based access control. For example, this might look like:

enable_method_auth! {
    methods {
        withdraw_earnings => restrict_to: [OWNER];
    }
}

// --snip--

.prepare_to_globalize(
    OwnerRole::Fixed(rule!(require(owner_badge.resource_address())))
)

Verify your caller precisely with the AuthZone

If your badge resource is non-fungible you can authorize the method and retrieve the non-fungible data from it. To do that you:

- Pass the non-fungible local ID as one of your Scrypto function's arguments
- Check a Proof with that ID and resource address is on the Auth Zone with assert_access_rule
- Use the ID to get the non-fungible data.

fn get_non_fungible_data(
    &self,
    badge_local_id: NonFungibleLocalId,
) -> OwnerBadgeData {
    // assemble Proof global_id
    let global_id = NonFungibleGlobalId::new(self.owner_badge_address, badge_local_id.clone());

    // check that a Proof of the non-fungible resource is in the AuthZone
    Runtime::assert_access_rule(rule!(require(global_id)));

    // get the data from the non-fungible resource
    let non_fungible_data = ResourceManager::from(self.owner_badge_address)
        .get_non_fungible_data::<OwnerBadgeData>(&badge_local_id);

    // return the non-fungible data
    non_fungible_data
}

This also gives us access to our non-fungible's local ID as it was passed in as a method argument. This pattern can also be used to validate the calling component using the global_caller implicit requirement (advanced-accessrules.md#caller-requirements) .
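As a mental model, this precise check boils down to rebuilding a global ID from the known resource address plus the caller-supplied local ID, then requiring a matching Proof on the AuthZone. A hypothetical plain-Rust sketch (names are ours, not Scrypto's):

```rust
// Hypothetical model of Runtime::assert_access_rule(rule!(require(global_id)))
// for a non-fungible: the check passes only if the AuthZone holds a Proof
// of exactly that resource address + local ID.
#[derive(Clone, PartialEq, Debug)]
struct NonFungibleGlobalId {
    resource_address: String,
    local_id: String,
}

fn assert_access_rule(
    required: &NonFungibleGlobalId,
    auth_zone: &[NonFungibleGlobalId],
) -> Result<(), &'static str> {
    if auth_zone.iter().any(|proof| proof == required) {
        Ok(())
    } else {
        // In the engine, a failed check aborts the whole transaction.
        Err("access rule not satisfied")
    }
}

fn main() {
    let required = NonFungibleGlobalId {
        resource_address: "owner_badge".to_string(),
        local_id: "#1#".to_string(),
    };
    // Caller claims to be #2# but only holds a Proof of #1#: caught.
    let claimed = NonFungibleGlobalId { local_id: "#2#".to_string(), ..required.clone() };
    let auth_zone = vec![required.clone()];

    assert!(assert_access_rule(&required, &auth_zone).is_ok());
    assert!(assert_access_rule(&claimed, &auth_zone).is_err());
}
```

Because the local ID is part of the required global ID, a caller cannot pass someone else's ID while proving their own badge: the composed requirement simply won't match.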
Methods Protected with assert_access_rule

A Runtime::assert_access_rule call protects a method, so it often won't also need to be in the enable_method_auth! macro at the top of your blueprint. However, the macro is still a good way to see which methods are protected at a glance, so we encourage you to still add a user role check, even if also manually checking a user badge.

Verify your caller precisely with a proof by intent

The general pattern is to accept a tangible Proof as a parameter, and then check that the proof is of the expected resource. Typically, this will be done in three steps:

Checking Proofs

If Proofs aren't placed on the Auth Zone you will need to check that they are from the correct resource before you do anything else with them.

fn check_admin_proof(&self, admin_proof: Proof) -> CheckedProof {
    admin_proof.check(self.admin_badge_address)
}

A check like this will panic and the transaction will fail if the resource address of the Proof and the .check argument do not match.

Getting Non-fungible Data from Proofs

With our Proof checked we can retrieve any non-fungible data from it. E.g. if we've derived the following AdminBadgeData for our badge and added it to our Scrypto package, outside the blueprint, with code like this:

#[derive(ScryptoSbor, NonFungibleData)]
struct AdminBadgeData {
    admin_id: String,
}

We could then retrieve the admin_id data field like so:

fn get_admin_id(&self, admin_proof: NonFungibleProof) -> String {
    // check the proof and retrieve the non-fungible data
    let non_fungible_data = admin_proof
        // check the proof
        .check(self.admin_badge_address)
        // retrieve data
        .non_fungible::<AdminBadgeData>()
        .data();

    // return the admin id
    non_fungible_data.admin_id
}

Getting Non-fungible IDs from Proofs

Sometimes we might need to retrieve a non-fungible ID from a Proof. In this example we've stored our admin IDs as the non-fungible local IDs of our badges instead of in a data field. We can retrieve it like so.
fn get_admin_id(&self, admin_proof: NonFungibleProof) -> NonFungibleLocalId {
    // check the proof then retrieve and return the non-fungible local ID
    let non_fungible_id = admin_proof
        // check the proof
        .check(self.admin_badge_address)
        // retrieve non-fungible local ID
        .non_fungible_local_id();

    // return the non-fungible local ID
    non_fungible_id
}

Or if we need to unwrap the type to something other than a NonFungibleLocalId we can use some more advanced Rust pattern matching.

fn get_admin_id(&self, admin_proof: NonFungibleProof) -> String {
    // check the proof and retrieve the non-fungible local ID
    let non_fungible_id_string = match admin_proof
        // check the proof
        .check(self.admin_badge_address)
        // retrieve non-fungible local ID
        .non_fungible_local_id()
    {
        // if it has a String type local ID return it as a String
        NonFungibleLocalId::String(local_id) => local_id.value().to_owned(),
        // We know the local ID type as we minted the badge,
        // so other possibilities are unreachable
        _ => unreachable!("All admin badges have String local IDs"),
    };

    // return the non-fungible local ID as a string
    non_fungible_id_string
}

Using proofs to call methods

When calling a protected method, you will need to prepare your proofs:

- Typically, you will need to ensure proofs are on your AuthZone
- If the method takes a proof by intent, you may also need to pass a tangible proof. This proof may need to be cloned if it also needs to be on your AuthZone to pass an AuthZone check.

Calling AuthZone protected methods in Scrypto

To call an AuthZone protected method, you will need to put the required proofs on your AuthZone.
You can do that as follows:

pub fn withdraw_earnings(&mut self, owner_badge: Proof) -> FungibleBucket {
    // place the proof on the local auth zone authorizing methods called within
    // this one
    LocalAuthZone::push(owner_badge);

    // withdraw XRD collected from the gumball machine, authorized by the owner
    // badge proof
    let earnings = self.gumball_machine_component.withdraw_earnings();

    // remove the proof from the local auth zone to prevent unauthorized method
    // calls
    let proof = LocalAuthZone::pop().unwrap();

    // The proof can be dropped manually, or will automatically be dropped
    // when this component returns
    proof.drop();

    // return the earnings
    earnings
}

Alternatively, there is an abbreviated form that automates adding and removing the Proof to the AuthZone:

pub fn withdraw_earnings(&mut self, owner_badge: Proof) -> FungibleBucket {
    // place the proof on the local auth zone authorizing methods called within
    // its closure method, then remove it
    owner_badge.authorize(|| {
        // withdraw XRD collected from the gumball machine, authorized by the
        // owner badge proof
        self.gumball_machine_component.withdraw_earnings()
    })
}

For convenience, the authorize method also exists on Vaults and Buckets. This method creates a Proof, calls authorize with it, and then drops it.

Passing a Proof by intent

In Scrypto, you can just pass a non-restricted tangible Proof in the arguments of a cross-component call (cross-blueprint-calls) .

In transaction manifests, to provide Proofs as method arguments, you will usually just pop the latest added Proof from the AuthZone and then call your method, as follows:

POP_FROM_AUTH_ZONE Proof("badge_proof");
CALL_METHOD
    Address("${COMPONENT_ADDRESS}")
    "method_name"
    Proof("badge_proof");

For some dApps, you will also need to keep a copy of the Proof on your AuthZone to pass an AuthZone based access check, AND also pass the same Proof by intent. To do this, you can use one of two techniques.
The easiest is to simply use CREATE_PROOF_FROM_AUTH_ZONE_OF_ALL to create a separate tangible Proof for passing into the method, which will be backed by the same resources as the Proof(s) on the AuthZone:

CREATE_PROOF_FROM_AUTH_ZONE_OF_ALL
    Address("${BADGE_RESOURCE_ADDRESS}")
    Proof("badge_proof");
CALL_METHOD
    Address("${COMPONENT_ADDRESS}")
    "method_name"
    Proof("badge_proof");

Or if you don't know the resource address, you can pop it from the AuthZone, explicitly clone the Proof and return it to the AuthZone:

POP_FROM_AUTH_ZONE Proof("badge_proof_1");
CLONE_PROOF Proof("badge_proof_1") Proof("badge_proof_2");
PUSH_TO_AUTH_ZONE Proof("badge_proof_1");
CALL_METHOD
    Address("${COMPONENT_ADDRESS}")
    "method_name"
    Proof("badge_proof_2");

## Call a Protected Method/Function

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/auth/call-a-protected-method-function
Updated: 2026-02-18
Summary: To call a protected method or function, you must show proof that you can pass some AccessRule. For protected methods, this AccessRule can be any access rule ass

Build > Scrypto > Authorization > Call a Protected Method/Function — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/call-a-protected-method-function.md)

To call a protected method or function, you must show proof that you can pass some AccessRule. For protected methods, this AccessRule can be any access rule assigned to a role which has permission to call that method. For protected functions, this is the directly assigned AccessRule for the given function. Proving this is accomplished by creating Proof objects and pushing these onto one's AuthZone before calling the protected method/function.

Proofs

One of the important conventions of badge usage is that, under normal usage, they are not actually withdrawn from a Vault and passed around. Instead, a Proof is created and used to prove that an actor owns that badge - or at least access to it.
You can create a Proof for a particular resource from a Vault or (rarely) a Bucket, and then do things with that Proof without changing ownership of the underlying contents. For example, if my account holds a FLIX token signifying that I am a member of Radflix, I can create a Proof of that token and present it to a Radflix component so that it will allow me to access it. I’m not actually transferring it…even if the Radflix component was buggy or malicious it would have no ability to take control of the underlying token from which the Proof was generated. Think of it just like flashing a badge in the real world. Whoever you show it to can see that you possess it, and can inspect it, but you’re not actually handing it to them so they can’t hang on to it. Proofs have a quantity associated with them, and a Proof can not be created with a quantity of 0. That is, if you have a Vault which is configured to hold a specified resource, but that Vault is empty, you can’t create a Proof of that resource. Examples of how to create and use Proofs can be found in Using Proofs (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/using-proofs.md) The Authorization Zone The top level of every transaction, accessible by the transaction manifest (transaction-manifest) , contains a worktop where resources are stored, and an authorization zone where Proofs are stored. When calling any Scrypto method from the manifest, the rules governing access to that method are automatically compared to the contents of the authorization zone. If the rules can be met, access to the method is granted and the call succeeds. If the rules can’t be met, then the transaction immediately aborts. There’s no need to specify what Proofs you think are necessary to meet the rules; the system just figures it out for you. The same logic applies within a component called directly from the manifest. 
If it attempts a privileged action on a resource, such as trying to mint additional supply, the rules are checked against the contents of the authorization zone. If the rules can be met, the action succeeds. If the rules can’t be met, the transaction aborts. That’s it. In the vast majority of use cases, you don’t have to think about access control or the authorization zone. The system just takes care of it. ## Authorization URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/auth/auth Updated: 2026-02-18 Summary: Authorization is the process of giving a caller the ability to call a method or function. Traditional blockchains often use the caller's address for this purpos Build > Scrypto > Authorization — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/README.md) Authorization is the process of giving a caller the ability to call a method or function. Traditional blockchains often use the caller's address for this purpose (like msg.sender in Solidity) and require the smart contract to implement custom authorization logic. Scrypto, on the other hand, provides a built-in system based on Badges, AccessRules, and Roles. Authorized Call Pattern The basic pattern for calling a protected method or function consists of 3 steps: - Generate a Proof. Proofs are created from buckets or vaults and act as “proof” that you have access to some amount of resources. - Push Proof onto the AuthZone. Every caller has an AuthZone which is used to store proofs for the purpose of Authorization. - Call the Protected Method or Function. Once a call is made a check is made against the method/function’s AccessRule(s) and what is in the caller’s AuthZone. If the two match, the caller will be able to continue with the call. Otherwise, the transaction will fail. 
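The three steps above can be sketched end to end in plain Rust (a toy model — the types and names are ours, not the Scrypto API; a Proof is reduced to just the resource address it proves):

```rust
use std::collections::HashSet;

// Toy model of the authorized call pattern (not the Scrypto API).
type Proof = String;
type AuthZone = HashSet<Proof>;

// An AccessRule, reduced to the rule!(require(resource_address)) form.
struct AccessRule {
    required_resource: String,
}

// Step 3: the engine compares the AccessRule with the caller's AuthZone.
fn call_protected_method(
    rule: &AccessRule,
    auth_zone: &AuthZone,
) -> Result<&'static str, &'static str> {
    if auth_zone.contains(&rule.required_resource) {
        Ok("method body runs")
    } else {
        Err("transaction fails")
    }
}

fn main() {
    let rule = AccessRule { required_resource: "admin_badge".to_string() };
    let mut auth_zone = AuthZone::new();

    // No proof pushed yet: the call is rejected.
    assert!(call_protected_method(&rule, &auth_zone).is_err());

    // Step 1: generate a Proof (here, of a badge resource we hold).
    let proof: Proof = "admin_badge".to_string();
    // Step 2: push it onto the AuthZone.
    auth_zone.insert(proof);
    // Step 3: the protected call now passes the check.
    assert_eq!(call_protected_method(&rule, &auth_zone), Ok("method body runs"));
}
```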
Here is an example flow of how this process may look: In this example, the Badge Holding Component holds a resource used for auth (called a “badge”) and produces proofs of the held resource. The caller calls a method on this component to retrieve a proof of this resource. They can then push it onto their AuthZone, after which they may call a protected method. Manage Access to Functions and Methods with AccessRules Managing who can access a function or method is done by assigning an AccessRule to that function or method in Scrypto. An AccessRule is a rule which defines whether or not the proofs contained in an AuthZone are sufficient to pass. A simple example of an AccessRule looks like: let some_resource_address: ResourceAddress = { .. }; let access_rule: AccessRule = rule!(require(some_resource_address)); access_rule in this case is a rule which only passes if the AuthZone contains a proof of some_resource_address. The assignment of AccessRules to functions and methods is slightly different between the two. For functions, an AccessRule is assigned directly to a given function (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/assign-function-accessrules.md) . For methods, a role based access control system is used which statically maps a set of roles to each method (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/structure-roles-and-methods.md) and an AccessRule is assigned to each role (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/assign-component-roles.md) . 
Useful Auth Design Patterns - User Badge Pattern (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/scrypto-design-patterns/user-badge-pattern.md) - Actor Virtual Badge Pattern (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/scrypto-design-patterns/actor-virtual-badge-pattern.md) - The Withdraw Pattern (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/scrypto-design-patterns/the-withdraw-pattern.md) - Transient Badge Pattern (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/scrypto-design-patterns/transient-badge-pattern.md) ## Freezing Vaults URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/resources/freezing-vaults Updated: 2026-02-18 Summary: Freezing vaults is only possible for resources where this feature has been enabled, by setting an explicit rule for freezer or freezerupdater roles. Such resour Build > Scrypto > Resources > Freezing Vaults — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/resources/freezing-vaults.md) Freezing vaults is only possible for resources where this feature has been enabled, by setting an explicit rule for freezer or freezer_updater roles. Such resources are clearly flagged in the Radix explorer and wallet, as per the resource behaviors (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/resources/resource-behaviors.md) guide. To Freeze or Unfreeze Retrieve a vault address from an API such as the Gateway API, and then you can pass it to the FREEZE_VAULT or UNFREEZE_VAULT manifest commands, assuming the requisite proofs are on the authzone for the freezer role AccessRule for the resource. These commands use a binary flags approach to specify which actions to freeze. 
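Since each freezable action gets its own bit, the value passed to these commands is just the sum (equivalently, bitwise OR) of the flags for the actions you want. A small plain-Rust sketch of the arithmetic (the constant and function names are ours, for illustration):

```rust
// The FREEZE_VAULT/UNFREEZE_VAULT flag values described in this guide:
const WITHDRAWS: u32 = 1;
const DEPOSITS: u32 = 2;
const BURNS: u32 = 4;

// Combine the desired actions into the single u32 the manifest expects.
fn freeze_flags(withdraws: bool, deposits: bool, burns: bool) -> u32 {
    (if withdraws { WITHDRAWS } else { 0 })
        | (if deposits { DEPOSITS } else { 0 })
        | (if burns { BURNS } else { 0 })
}

fn main() {
    assert_eq!(freeze_flags(true, false, false), 1); // withdraws only
    assert_eq!(freeze_flags(false, true, false), 2); // deposits only
    assert_eq!(freeze_flags(true, true, true), 7); // withdraws + deposits + burns
    // Decoding works the same way in reverse, by masking:
    assert_eq!(7 & BURNS, BURNS);
}
```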
Add the flags up to determine the value to specify on calls to FREEZE/UNFREEZE:

- 1 - Withdraws
- 2 - Deposits
- 4 - Burns

You can check out a few examples below:

# Freeze Withdraws from a vault
FREEZE_VAULT Address("<vault_address>") Tuple(1u32);

# Freeze Deposits into a vault
FREEZE_VAULT Address("<vault_address>") Tuple(2u32);

# Freeze Burns in a vault
FREEZE_VAULT Address("<vault_address>") Tuple(4u32);

# Freeze Withdraws/Deposits/Burns of a vault
FREEZE_VAULT Address("<vault_address>") Tuple(7u32);

# Unfreeze Withdraws from a vault
UNFREEZE_VAULT Address("<vault_address>") Tuple(1u32);

# Unfreeze Deposits into a vault
UNFREEZE_VAULT Address("<vault_address>") Tuple(2u32);

# Unfreeze Burns in a vault
UNFREEZE_VAULT Address("<vault_address>") Tuple(4u32);

# Unfreeze Withdraws/Deposits/Burns of a vault
UNFREEZE_VAULT Address("<vault_address>") Tuple(7u32);

E.g. a <vault_address> could look like internal_vault_sim1tzmfs5qf8xkkqptce9naaf0f52fn6lgfg8uyr4me5qs5derjcmquc4.

## Recalling Resources

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/resources/recalling-resources
Updated: 2026-02-18
Summary: Recalling resources is only possible for resources where this feature has been enabled, by setting an explicit rule for recaller or recaller_updater. Such resour

Build > Scrypto > Resources > Recalling Resources — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/resources/recalling-resources.md)

Recalling resources is only possible for resources where this feature has been enabled, by setting an explicit rule for recaller or recaller_updater. Such resources are clearly flagged in the Radix explorer and wallet, as per the resource behaviors (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/resources/resource-behaviors.md) guide.

Recall Transaction

Retrieve a vault address from an API such as the Gateway API, and then you can pass it to a recall command. You can recall from the manifest with these commands, assuming you have the requisite proofs on the authzone for the recaller role of the resource.
RECALL_FROM_VAULT
    Address("<vault_address>")
    Decimal("1");

RECALL_NON_FUNGIBLES_FROM_VAULT
    Address("<vault_address>")
    Array<NonFungibleLocalId>(
        NonFungibleLocalId("#123#"),
        NonFungibleLocalId("#456#")
    );

E.g. a <vault_address> could look like internal_vault_sim1tzmfs5qf8xkkqptce9naaf0f52fn6lgfg8uyr4me5qs5derjcmquc4.

You can also recall from a component, by passing the vault address into the component from the manifest, and then doing the following:

// Method inside component
pub fn recall_fungible_from_internal_vault(&self, vault_address: InternalAddress, amount: Decimal) -> Bucket {
    self.recaller_badge_vault.authorize(|| {
        let recalled_bucket: Bucket = scrypto_decode(&ScryptoVmV1Api::object_call_direct(
            vault_address.as_node_id(),
            VAULT_RECALL_IDENT,
            scrypto_args!(amount),
        )).unwrap();

        recalled_bucket
    })
}

There will be an improved Scrypto API for programmatic recall in the future. Note that it is not currently possible to source the vault address from the Radix Engine, so it must be determined from an off-ledger indexer/API, and passed in through a transaction.

## Non-fungible Data

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/resources/non-fungible-data
Updated: 2026-02-18
Summary: This topic is still a work in progress

Build > Scrypto > Resources > Non-fungible Data — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/resources/non-fungible-data.md)

This topic is still a work in progress

This area is either a placeholder for content which is yet to be written, or has sparse/early content which is still being worked on.

Each Non-fungible on the Radix ledger can have its own unique associated data. Unlike metadata, this non-fungible data is unique to the token rather than shared across the resource collection.
There are several common fields for Displaying Non-fungible data (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/non-fungible-standards/non-fungible-data-for-wallet-display.md) in the Radix Wallet (https://github.com/gguuttss/radix-docs/blob/master/use/radix-wallet-overview.md) , Dashboard (https://github.com/gguuttss/radix-docs/blob/master/use/radix-dashboard.md) and other explorers, but we can choose whichever fields we want when defining a non-fungible data structure.

Non-fungible data structures are defined when creating a non-fungible resource collection. The specific data for each token is then added at minting. It is immutable by default, but we can choose for some or all of it to be updatable. Below you will find how to define, set, retrieve and update non-fungible data.

Creating a Resource that has Non-fungible Data

To add non-fungible data to a resource we first define the data structure.

// A top-level struct must derive NonFungibleData.
// All types referenced directly/indirectly also need to derive ScryptoSbor.
#[derive(ScryptoSbor, NonFungibleData)]
struct MyData {
    name: String,
    description: String,
    // Note that marking top-level fields as `#[mutable]` means that the data
    // under that field can be updated.
    #[mutable]
    mutable_field: String,
    // Any other custom data fields could go here
}

With a defined non-fungible data struct we can create the resource. Note where ::<MyData> is below.

#[blueprint]
mod example {
    struct Example { }

    impl Example {
        pub fn new() -> Global<Example> {
            let collection_manager = ResourceBuilder::new_ruid_non_fungible::<MyData>(OwnerRole::None)
            // --snip--

Adding Non-fungible Data When Minting

When minting new non-fungibles we decide the values for any data they will hold.
This is a little different for the different non-fungible types: RUID Type Non-fungibles pub fn mint_non_fungible( &mut self, name: String, description: String, ) -> NonFungibleBucket { // Create non-fungible data let non_fungible_data = MyData { name, description, mutable_field: "original value".to_owned(), }; // Mint a single non-fungible with the data self.collection_manager.mint_ruid_non_fungible(non_fungible_data) } Integer Type Non-fungibles pub fn mint_non_fungible( &mut self, id: u64, name: String, description: String, ) -> NonFungibleBucket { // Create non-fungible data let non_fungible_data = MyData { name, description, mutable_field: "original value".to_owned(), }; // Mint a single non-fungible with the data self.collection_manager .mint_non_fungible(&NonFungibleLocalId::integer(id), non_fungible_data) } String Type Non-fungibles pub fn mint_non_fungible( &mut self, id: String, name: String, description: String, ) -> NonFungibleBucket { // Create non-fungible data let non_fungible_data = MyData { name, description, mutable_field: "original value".to_owned(), }; // Mint a single non-fungible with the data self.collection_manager .mint_non_fungible(&NonFungibleLocalId::string(id).unwrap(), non_fungible_data) } Byte Type Non-fungibles pub fn mint_non_fungible( &mut self, id: [u8; 32], name: String, description: String, ) -> NonFungibleBucket { // Create non-fungible data let non_fungible_data = MyData { name, description, mutable_field: "original value".to_owned(), }; // Mint a single non-fungible with the data self.collection_manager .mint_non_fungible(&NonFungibleLocalId::bytes(id).unwrap(), non_fungible_data) } Retrieving Non-fungible Data Using a Non-fungible Local ID With the resource address and a local ID we can retrieve non-fungible data. The collection_manager holds the resource address in the example below.
pub fn get_non_fungible_data_by_id(&self, id: NonFungibleLocalId) -> MyData { self.collection_manager.get_non_fungible_data::<MyData>(&id) } From Buckets For a bucket containing a single non-fungible pub fn get_non_fungible_id_and_data_from_bucket( &self, bucket: NonFungibleBucket, ) -> (NonFungibleLocalId, MyData) { let non_fungible_id = bucket.non_fungible_local_id(); let non_fungible_data = bucket.non_fungible().data(); (non_fungible_id, non_fungible_data) } For a bucket containing multiple non-fungibles pub fn get_multiple_non_fungible_ids_and_data_from_bucket( &self, bucket: NonFungibleBucket, ) -> Vec<(NonFungibleLocalId, MyData)> { // Get all non-fungible IDs from the bucket let non_fungible_ids = bucket.non_fungible_local_ids(); // Get all the non-fungible data from the bucket let non_fungible_data: Vec<MyData> = bucket .non_fungibles() .iter() .map(|non_fungible| non_fungible.data()) .collect(); // For each ID, get the associated data and add to results non_fungible_ids.iter().cloned().zip(non_fungible_data).collect() } From Proofs There are different ways to retrieve non-fungible data from Proofs depending on whether they are sitting on the Auth Zone (../auth/using-proofs.md#verify-your-caller-precisely-with-the-authzone) or have been passed by intent (../auth/using-proofs.md#getting-non-fungible-data-from-proofs). Have a look at the Authorizing callers of your method section (../auth/using-proofs.md#authorizing-callers-of-your-method) of Using Proofs (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/using-proofs.md) for a complete description of how to do this. Reading Individual Data Fields There is no way to read a single specified non-fungible data field by name yet. However if you know its position in the data struct you can use the following method.
pub fn get_non_fungible_data_field( &self, field_index: usize, id: NonFungibleLocalId, ) -> ScryptoValue { // Get the non-fungible data let structured_data: ScryptoValue = self.collection_manager.call( NON_FUNGIBLE_RESOURCE_MANAGER_GET_NON_FUNGIBLE_IDENT, &NonFungibleResourceManagerGetNonFungibleInput { id: id.clone() }, ); // Unwrap the tuple to get the fields let ScryptoValue::Tuple { fields } = structured_data else { panic!("NF data was not a tuple"); }; // Retrieve then return the field at the given index fields.get(field_index).unwrap().to_owned() } You could use a Gateway API call (https://radix-babylon-gateway-api.redoc.ly/#operation/NonFungibleData) to find the position of your chosen data field. Modifying Non-fungible Data For a non-fungible data field to be mutable we have to mark it as such in its struct. // All types referenced directly/indirectly also need to derive ScryptoSbor. #[derive(ScryptoSbor, NonFungibleData)] pub struct MyData { name: String, description: String, #[mutable] mutable_field: String, } We can then update it, as long as we have the resource address and our token's local ID. Updating Fields Using Scrypto In most circumstances, to update a non-fungible data field in Scrypto you will need a resource-owner or resource-non-fungible-data-updater badge stored in your component. In the example below that's our owner_badge, which we use to authorize the update. The authorized closure (the closure passed to authorize_with_amount()) can then use the resource address in the form of the collection_manager and the id of our chosen non-fungible to identify and update the mutable_field value. pub fn update_non_fungible_data( &mut self, id: NonFungibleLocalId, new_field_value: String, ) { self.owner_badge.authorize_with_amount(1, || { self.collection_manager.update_non_fungible_data( &id, "mutable_field", new_field_value, ); }); } Updating Fields Using Transaction Manifests We can change updatable non-fungible data fields using Radix transaction manifests.
In the example below we first authorize the change by adding an owner_badge Proof to the authzone, then we call update_non_fungible_data on the resource manager. CALL_METHOD Address("${account_address}") "lock_fee" Decimal("100") ; CALL_METHOD Address("${account_address}") "create_proof_of_amount" Address("${non_fungible_owner_badge_address}") Decimal("1") ; CALL_METHOD Address("${non_fungible_resource_address}") "update_non_fungible_data" NonFungibleLocalId("${non_fungible_local_id}") "mutable_field" # Field name "Updated Value" # New value ; ## Buckets and Vaults URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/resources/buckets-and-vaults Updated: 2026-02-18 Summary: Because Scrypto is about safe resource handling, we introduce the concept of resource containers. There are two types of resource containers: Buckets and Vaults Build > Scrypto > Resources > Buckets and Vaults — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/resources/buckets-and-vaults.md) Because Scrypto is about safe resource handling, we introduce the concept of resource containers. There are two types of resource containers: Buckets and Vaults. - Bucket - A bucket is a temporary container used to move resources during a transaction; therefore, it only exists for the duration of the transaction. - Vault - A vault is a permanent container and at the end of each transaction, all resources must be stored in a Vault. Buckets are transient resource containers which are used to move resources from one vault to another. As such, buckets are dropped at the end of the transaction and the resource held by the bucket must be stored in a vault or burned.
pub fn vault_to_vault(&mut self) { let bucket: Bucket = self.vault.take(dec!("1")); self.other_vault.put(bucket); } Fungible/NonFungible Buckets and vaults may be further categorized as Fungible or NonFungible, adding four more types of resource containers: - FungibleBucket - A bucket containing an amount of some fungible resource. - NonFungibleBucket - A bucket containing an amount of some non-fungible resource. - FungibleVault - A vault containing an amount of some fungible resource. - NonFungibleVault - A vault containing an amount of some non-fungible resource. These more specific types allow one to call more specialized methods for the given resource type: pub fn display_non_fungibles(bucket: Bucket) -> Bucket { let non_fungible_bucket: NonFungibleBucket = bucket.as_non_fungible(); println!("{:?}", non_fungible_bucket.non_fungibles()); non_fungible_bucket.into() } Note that the above will panic if the passed in bucket is actually of fungible type. Buckets Buckets and their Fungible/NonFungible variants are the main mechanism through which assets move. Below is a table of methods available to work with buckets. Bucket Methods Method Description .burn() Burns (or destroys) the resource contained within the bucket .create_proof_of_all() Creates a Proof of all the resources in the bucket .resource_address() Returns the address of the resource stored in the bucket .resource_manager() Returns the ResourceManager of the resource contained within the bucket .amount() Returns the amount of tokens stored in the bucket .is_empty() Returns true if the Bucket is empty, false otherwise. .put(bucket) Take a Bucket and put its content into the bucket on which this method is called. .take(amount) Take a quantity of tokens and return a new Bucket containing them. .take_advanced(amount, withdraw_strategy) Take a quantity of tokens with a certain withdraw strategy and return a new Bucket containing them. .drop_empty() If the Bucket is empty, the Bucket is dropped.
Otherwise, an error will be returned. .as_fungible() Convert the bucket into a FungibleBucket. Note: This method panics if the underlying resource is not a fungible resource. .as_non_fungible() Convert the bucket into a NonFungibleBucket. Note: This method panics if the underlying resource is not a non-fungible resource. .authorize_all(function) Authorizes an action by putting the badges present in the Bucket on the AuthZone before running the specified function. Note: more information about this here. FungibleBucket Methods A FungibleBucket inherits all of the Bucket methods with the following additional methods: Method Description .create_proof_of_amount(amount) Creates a Proof of a certain amount of the resources in the bucket. .authorize_with_amount(amount, function) Authorizes an action by putting a proof of a certain amount present in the Bucket on the AuthZone before running the specified function. NonFungibleBucket Methods A NonFungibleBucket inherits all of the Bucket methods with the following additional methods: Method Description .non_fungibles() Returns a vector containing the data of each NFT present in the bucket. .take_non_fungibles(IndexSet) Take multiple non-fungible tokens from the bucket. This returns a new Bucket with the specified NFTs. .non_fungible() Returns the data of the NFT that the bucket contains. Note: this method panics if the bucket does not contain exactly one NFT .take_non_fungible(NonFungibleId) Takes a specific NFT from the bucket and returns a new Bucket that contains it. Note: this method panics if the bucket does not contain the specified NFT Vaults Buckets are used to store resources while they are moving during a transaction. On the other hand, vaults are used to store resources longer-term, in between transactions.
The distinction between these two containers allows the system to make sure no resources are ever lost if, for example, someone is withdrawing tokens from their account but forgets to insert them into another vault. The most straightforward way to illustrate the concept of vaults is with the account components. Each account component contains a vault for each resource type it owns, as seen in its state definition: struct Account { vaults: KeyValueStore<ResourceAddress, Vault> } When an account receives a new token (e.g. through a call to one of its deposit methods), it checks if it already instantiated a vault of this particular resource type. If a vault for the received resource type does not exist, one is created and the received tokens are stored in it. You can visualize the vaults and resources present on an account component (or any component) by using the resim show command as shown below. > resim new-account > resim new-token-fixed --symbol BTC --name Bitcoin 21000000 > resim show [account_address] Component: account_sim1q0esun0yw3y6h4glaea4fds5kzd3j0hayqxpec589m3s0kjz2g Access Rules State: Tuple(KeyValueStore("b825ef8e0e15fe71d85b7e3f4adb4c0c8bb9e9902fdb52ff4d7c73219c7d286403040000")) Key Value Store: AccountComponent[03f30e4de47449abd51fee7b54b614b09b193efd200c1ce2872ee3][...]
├─ ResourceAddress("resource_sim1qzkcyv5dwq3r6kawy6pxpvcythx8rh8ntum6ws62p95sqjjpwr") => Vault("b825ef8e0e15fe71d85b7e3f4adb4c0c8bb9e9902fdb52ff4d7c73219c7d286405040000") └─ ResourceAddress("resource_sim1qrgs0ge3nm2sh6fc09rtfgzcx4jfvk6uk77t9sqdu66qrs6z50") => Vault("93a0a362a5aa37e847e5b9a94343061ee6738f19ad81f5ce04b4558cd6fbd2f705040000") Resources: ├─ { amount: 1000, resource address: resource_sim1qzkcyv5dwq3r6kawy6pxpvcythx8rh8ntum6ws62p95sqjjpwr, name: "Radix", symbol: "XRD" } └─ { amount: 21000000, resource address: resource_sim1qrgs0ge3nm2sh6fc09rtfgzcx4jfvk6uk77t9sqdu66qrs6z50, name: "Bitcoin", symbol: "BTC" } In the Key Value Store section, you can see the mapping between the resource addresses and their corresponding Vault. Resim then displays the resource list in a user-friendly format in the Resources section. Creating Vaults In Scrypto, there are two main ways for creating a Vault: Function Description Vault::new(ResourceAddress) This creates a new empty Vault that will be used to store resources of the specified ResourceAddress. Vault::with_bucket(Bucket) This is a shortcut for creating a new Vault and inserting a Bucket of tokens inside. The ResourceAddress is inferred from the passed Bucket. Here is a simple example: use scrypto::prelude::*; #[blueprint] mod my_module { struct MyBlueprint { my_vault: Vault } impl MyBlueprint { pub fn instantiate(tokens: Bucket) -> Global<MyBlueprint> { Self { // Create a Vault from the provided bucket // and store it on the component state my_vault: Vault::with_bucket(tokens) }.instantiate().globalize() } } } Just like buckets have to be deposited into a vault by the end of a transaction, Vaults must be stored on a component state by the end of a transaction.
Vault Methods Method Description .put(bucket) Deposits a Bucket into this vault .resource_address() Returns the address of the resource stored in the vault .resource_manager() Returns the ResourceManager of the resource contained within the vault .amount() Returns the amount of tokens stored in the vault .is_empty() Returns true if the vault is empty, false otherwise. .take(amount) Returns a Bucket containing the specified quantity of the resources present in the vault .take_non_fungible(NonFungibleId) Returns a Bucket with an NFT of the specified ID taken from the vault .take_all() Returns a Bucket containing all the resources present in the vault .authorize(function) Authorizes an action by putting the badges present in the vault on the AuthZone before running the specified function. FungibleVault Methods A FungibleVault inherits all of the Vault methods with the following additional methods: Method Description .lock_fee(amount) Lock fee to pay for the transaction. Note: this panics if the vault contains a resource different than XRD. .lock_contingent_fee(amount) Lock fee to pay for the transaction, contingent on this transaction committing successfully. Note: this panics if the vault contains a resource different than XRD. .create_proof_of_amount(amount) Creates a Proof of a certain amount of resources in the vault .authorize_with_amount(amount, function) Authorizes an action by putting a proof of a certain amount present in the vault on the AuthZone before running the specified function. NonFungibleVault Methods A NonFungibleVault inherits all of the Vault methods with the following additional methods: Method Description .non_fungible_local_ids(limit) Returns an index-set containing the ids of the NFTs present in the vault.
The maximum number of NFTs returned is defined by limit. .contains_non_fungible(NonFungibleLocalId) Returns true if the vault contains a given NonFungible, false otherwise .non_fungibles() Returns a vector containing the data of each NFT present in the vault .take_non_fungibles(IndexSet) Takes a set of NFTs from the vault and returns a Bucket with those NFTs. .burn_non_fungibles(IndexSet) Burns a set of NFTs from the vault .non_fungible_local_id() Returns the non-fungible local id of the NFT in the vault. Note: this method panics if the vault does not contain exactly one NFT .non_fungible() Returns the data of the NFT that the vault contains. Note: this method panics if the vault does not contain exactly one NFT .take_non_fungible(NonFungibleId) Takes a specific NFT from the vault and returns a new Bucket that contains it. .create_proof_of_non_fungibles(IndexSet) Creates a Proof of a set of NFTs from the vault .authorize_with_non_fungibles(IndexSet) Authorizes an action by putting a proof of a set of NonFungibles present in the vault on the AuthZone before running the specified function. ## Resource Behaviors URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/resources/resource-behaviors Updated: 2026-02-18 Summary: New resources created with the ResourceBuilder can also be attributed with special characteristics. For example, you can specify behaviors to mark a resource as Build > Scrypto > Resources > Resource Behaviors — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/resources/resource-behaviors.md) New resources created with the ResourceBuilder can also be attributed with special characteristics. For example, you can specify behaviors to mark a resource as mintable, burnable, or even restrict its ability to be withdrawn, making them “Soulbound” (https://www.radixdlt.com/blog/asset-oriented-soulbound-tokens-done) tokens.
AccessRules can also be specified with resource behaviors, which allows you to determine the conditions that need to be present before the behavior action can be performed. We can also control the mutability of these authorization rules, keeping the ability to change them in the future or locking them to prevent their rules from being changed. Resource Behaviors Defining access rules Full details on the rule!(..) macro used below are covered in Advanced AccessRules (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/advanced-accessrules.md). Method Description OwnerRole::None OwnerRole::Fixed(rule!(..)) OwnerRole::Updatable(rule!(..)) The OWNER of a resource is set in the new_.. method of the resource builder and is by default able to update the metadata of the resource, and other rules can be set to forward to the OWNER role, as a useful fallback. It is recommended to set this to some kind of dApp admin badge so that the metadata can be updated in future if required. This may be necessary as part of establishing two-way dApp linking (metadata-for-verification) during bootstrapping. .mint_roles(mint_roles! { minter => rule!(..); // Or => OWNER; minter_updater => rule!(..); // Or => OWNER; }) Specify the AccessRule for MintRoles to provide permission to mint tokens. Defaults: minter => rule!(deny_all) and minter_updater => rule!(deny_all). .burn_roles(burn_roles! { burner => rule!(..); // Or => OWNER; burner_updater => rule!(..); // Or => OWNER; }) Specify the AccessRule for BurnRoles to provide permission to burn tokens. Defaults: burner => rule!(deny_all) and burner_updater => rule!(deny_all). .withdraw_roles(withdraw_roles! { withdrawer => rule!(..); // Or => OWNER; withdrawer_updater => rule!(..); // Or => OWNER; }) Specify the AccessRule for WithdrawRoles to provide permission to withdraw tokens from a Vault. Defaults: withdrawer => rule!(allow_all) and withdrawer_updater => rule!(deny_all).
.deposit_roles(deposit_roles! { depositor => rule!(..); // Or => OWNER; depositor_updater => rule!(..); // Or => OWNER; }) Specify the AccessRule for DepositRoles to provide permission to deposit tokens to a Vault. Defaults: depositor => rule!(allow_all) and depositor_updater => rule!(deny_all). .recall_roles(recall_roles! { recaller => rule!(..); // Or => OWNER; recaller_updater => rule!(..); // Or => OWNER; }) Specify the AccessRule for RecallRoles to provide permission to recall tokens. By default recalling is locked as rule!(deny_all). Defaults: recaller => rule!(deny_all) and recaller_updater => rule!(deny_all). .freeze_roles(freeze_roles! { freezer => rule!(..); // Or => OWNER; freezer_updater => rule!(..); // Or => OWNER; }) Specify the AccessRule for FreezeRoles to provide permission to freeze tokens. By default freezing is locked as rule!(deny_all). Defaults: freezer => rule!(deny_all) and freezer_updater => rule!(deny_all). .non_fungible_data_update_roles(non_fungible_data_update_roles! { non_fungible_data_updater => rule!(..); non_fungible_data_updater_updater => rule!(..); }) (Non-fungible only) Specify the AccessRule for NonFungibleDataUpdateRoles to provide permission to update the NfData attached to each individual non-fungible. Defaults: non_fungible_data_updater => rule!(deny_all) and non_fungible_data_updater_updater => rule!(deny_all). .divisibility(number) (Fungible only) The divisibility is a number between 0 and 18 and it represents the number of decimal places that this resource can be split into. For example, if you set the divisibility to 0, people will only be able to send whole amounts of that resource. Default: 18. You can also use the constants DIVISIBILITY_NONE = 0 and DIVISIBILITY_MAXIMUM = 18 .metadata(metadata! 
{ init { "name" => "Super Admin Badge".to_string(), locked; } }) Not strictly about behaviour on ledger, but you will likely wish to configure metadata (scrypto-entity-metadata) as per the Metadata for Wallet Display (metadata-for-wallet-display) and Metadata for Verification (metadata-for-verification) standards so that the resource has a clear identity in wallets and explorers. Mintable To make a resource mintable means that we allow the creation of additional supply of that resource. We can do this by simply adding .mint_roles() when we create our resource and map the AccessRule to each BurnRoles. // Note our resource takes and OwnerRole argument this can be Fixed, Updatable, or None let my_token = ResourceBuilder::new_fungible(OwnerRole::Fixed(rule!(require(access_rule)))) .metadata(metadata!{ init { "name" => "My Token", locked; "symbol" => "TKN", locked; } }) .mint_roles(mint_roles!{ // #1 minter => rule!(allow_all); // #2 minter_updater => rule!(deny_all); // #3 }) .create_with_no_initial_supply(); - To make a resource mintable, you simply have to make a call to the mint_roles() method during the resource creation which requires that we map the AccessRule for two roles. - Here we set the AccessRule for the minter role, allow_all makes minting public. We can of course and often will want to restrict minting to a particular badge here instead with something like minter => rule!(require(badge_address)); instead. - The minter_updater is how we control the mutability of the minter role. deny_all locks the minter role, we can again also pass in a badge address to create an authority which can change the minter role like minter_updater => rule!(require(badge_address)); Burnable Having a resource burnable indicates that an specified supply of this resource can essentially be destroyed. If all the supply of that resource is burnt, the ResourceManager will still exist. Additionally, if that resource is mintable then more of that resource can be created. 
Similar to our previous example, to make our resource burnable, we call the .burn_roles() method when we create our resource and map the AccessRule to each of the BurnRoles. // Note our resource takes an OwnerRole argument; this can be Fixed, Updatable, or None let my_token = ResourceBuilder::new_fungible(OwnerRole::None) .metadata(metadata!{ init { "name" => "My Token", locked; "symbol" => "TKN", locked; } }) .burn_roles(burn_roles!{ burner => rule!(allow_all); // This makes the resource freely burnable. You could also require(admin_badge) to restrict who can burn the token. burner_updater => rule!(deny_all); }) .create_with_no_initial_supply(); Restrict Withdraw Resources restricted from being withdrawn are effectively locked in the Vault that contains them. This makes the resource soulbound and its most common use-case is to attach some form of identification or reputation to the account that owns that resource. // Note our resource takes an OwnerRole argument; this can be Fixed, Updatable, or None let my_token = ResourceBuilder::new_fungible(OwnerRole::Fixed(rule!(require(access_rule)))) .metadata(metadata!{ init { "name" => "My Token", locked; "symbol" => "TKN", locked; } }) .withdraw_roles(withdraw_roles!{ withdrawer => rule!(deny_all); withdrawer_updater => rule!(deny_all); }) .create_with_no_initial_supply(); Restrict Deposit Resources restricted from being deposited are commonly called transient resources. This forces a dangling resource to exist. If the resource can’t be deposited into a Vault, the resource must be burnt, else we will encounter a dangling resource error. Transient resources are most commonly used as a means to force a specified condition to happen within a transaction. If that condition is met, we can permit the resource to be burned. Alternatively, if we specify an authorization requirement, we can allow this resource to be deposited if a specified condition is met.
// Note our resource takes an OwnerRole argument; this can be Fixed, Updatable, or None let my_token = ResourceBuilder::new_fungible(OwnerRole::Fixed(rule!(require(access_rule)))) .metadata(metadata!{ init { "name" => "My Token", locked; "symbol" => "TKN", locked; } }) .deposit_roles(deposit_roles!{ depositor => rule!(deny_all); depositor_updater => rule!(deny_all); }) .create_with_no_initial_supply(); Recallable Token Making a resource recallable allows us to send our tokens to anybody while retaining the ability to retrieve them if we desire. The most common use case for this is to allow for “Rental NFTs” (https://www.radixdlt.com/blog/asset-oriented-rental-nfts-done). We can create conditions for how long this resource can essentially be borrowed. // Note our resource takes an OwnerRole argument; this can be Fixed, Updatable, or None let my_token = ResourceBuilder::new_fungible(OwnerRole::Fixed(rule!(require(access_rule)))) .metadata(metadata!{ init { "name" => "My Token", locked; "symbol" => "TKN", locked; } }) .recall_roles(recall_roles!{ recaller => rule!(require(admin_badge)); recaller_updater => rule!(deny_all); }) .create_with_no_initial_supply(); Freezable Token When building regulated assets you may need the ability to freeze those assets, so Scrypto has this functionality built in for you to compose for your own use case. // Note our resource takes an OwnerRole argument; this can be Fixed, Updatable, or None let freezer_token = ResourceBuilder::new_fungible(OwnerRole::Fixed(rule!(require(access_rule)))) .metadata(metadata! { init { "name" => "My Token", locked; "symbol" => "TKN", locked; } }) .freeze_roles(freeze_roles! { freezer => rule!(require(admin_badge)); freezer_updater => rule!(deny_all); }) .mint_initial_supply(1000) .into(); A Non-fungible with updatable metadata let non_fungible: NonFungibleResourceManager = ResourceBuilder::new_ruid_non_fungible::<MyData>(OwnerRole::None) .metadata(metadata!
{ init { "name" => "My NF Resource", locked; } }) .non_fungible_data_update_roles(non_fungible_data_update_roles! { non_fungible_data_updater => rule!(require(admin_badge)); non_fungible_data_updater_updater => rule!(deny_all); }) .create_with_no_initial_supply() .into(); Updating the rules after creating resources Up till now, we have specified all rules with an _updater rule set to rule!(denyall). This means that it can never be changed, ever. Instead of rule!(deny_all) you could provide a custom rule with the usual rule!() macro. The authority you provide here has the ability to update the rule in the future. They can change the rule at will, and at any point they also have the right to change the _updater role’s rule to rule!(denyall), so that it may never again be changed. This means that it will display as fixed in the wallet. Locking is a one-way process…​ there’s no going back to a mutable rule once it has been locked. Here’s an example of playing with some rules around freezing a token: // Initial creation, rule_admin is some badge address we have previously defined let resource_address = ResourceBuilder::new_fungible(OwnerRole::Updatable(rule!(require(access_rule)))) .metadata(metadata!( init { "name" => "Globally freezable token", locked; } )) .withdraw_roles(withdraw_roles! { withdrawer => rule!(allow_all); withdrawer_updater => rule!(require(rule_admin)); }) .deposit_roles(deposit_roles! { depositor => rule!(allow_all); depositor_updater => rule!(require(rule_admin)); }) .create_with_no_initial_supply(); ... // Later in the code // `rule_admin_vault` is a vault that contains the badge allowed to make the following changes. 
self.rule_admin_vault.authorize(|| { // Freeze the token, so no one may withdraw or deposit it let resource_manager = ResourceManager::from_address(resource_address); resource_manager.set_depositable(AccessRule::DenyAll); resource_manager.set_withdrawable(AccessRule::DenyAll); // ...or, make it so only a person presenting the proper badge can withdraw or deposit resource_manager.set_depositable(rule!(require(transfer_badge))); resource_manager.set_withdrawable(rule!(require(transfer_badge))); // Unfreeze the token! resource_manager.set_depositable(AccessRule::AllowAll); resource_manager.set_withdrawable(AccessRule::AllowAll); // Lock the token in the unfrozen state, so it may never again be changed resource_manager.lock_depositable(); resource_manager.lock_withdrawable(); }); The authorize() method In the previous example, you saw the use of the authorize() method on the rule_admin_vault. This method is available on both Vaults and Buckets and is used to temporarily put a proof of the underlying resources on the authzone for authorization. The method takes as parameter a closure (https://doc.rust-lang.org/rust-by-example/fn/closures.html) with no argument and runs it after putting the proofs on the auth zone. After running the closure, the proofs are removed from the auth zone. Default Rules All roles have defaults they are set to when unspecified or set to None. 
They are:

| Roles | Role Rule | Updater Role Rule |
| --- | --- | --- |
| mint_roles | deny_all | deny_all |
| burn_roles | deny_all | deny_all |
| freeze_roles | deny_all | deny_all |
| recall_roles | deny_all | deny_all |
| withdraw_roles | allow_all | deny_all |
| deposit_roles | allow_all | deny_all |
| non_fungible_data_update_roles | deny_all | deny_all |

- For more info on the ResourceManager setting and updating methods check the Rust docs here (https://docs.rs/scrypto/latest/scrypto/resource/resource_manager/struct.ResourceManager.html) - For additional methods you may also want to check out the ResourceManagerStub Rust docs here (https://docs.rs/scrypto/latest/scrypto/resource/resource_manager/struct.ResourceManagerStub.html) ## Resource Creation in Detail URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/resources/resource-creation-in-detail Updated: 2026-02-18 Summary: The ResourceBuilder is your utility for creating new resources. You have a number of resource creation methods to make things easy. Build > Scrypto > Resources > Resource Creation in Detail — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/resources/resource-creation-in-detail.md) The ResourceBuilder The ResourceBuilder is your utility for creating new resources. You have a number of resource creation methods to make things easy. You can start using it like this: ResourceBuilder::new_.., then complete the building process using either create_with_no_initial_supply() or mint_initial_supply(..). What’s returned It’s important to note that create_with_no_initial_supply() will return a ResourceManager whereas mint_initial_supply(..) will return a bucket with the created supply of resources.
New Resource Methods

| Method | Arguments | Returns |
| --- | --- | --- |
| new_fungible() | owner_role: OwnerRole | InProgressResourceBuilder |
| new_string_non_fungible() | owner_role: OwnerRole | InProgressResourceBuilder |
| new_integer_non_fungible() | owner_role: OwnerRole | InProgressResourceBuilder |
| new_bytes_non_fungible() | owner_role: OwnerRole | InProgressResourceBuilder |
| new_ruid_non_fungible() | owner_role: OwnerRole | InProgressResourceBuilder |

Read the Resource Behaviors (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/resources/resource-behaviors.md) article for further customization options on the builder, including:

- Setting the OwnerRole
- Setting custom behaviours
- Setting divisibility for fungible resources
- Setting metadata

Here is an example creating a very simple fungible resource:

```rust
let my_token: FungibleBucket = ResourceBuilder::new_fungible(OwnerRole::None)
    .metadata(metadata!(
        init {
            "name" => "My Token", locked;
            "symbol" => "TKN", locked;
        }
    ))
    .divisibility(DIVISIBILITY_NONE) // No decimals allowed
    .mint_initial_supply(100);
```

The Non-Fungible variants also require a struct of NonFungibleData, as you can see below:

```rust
// The top-level struct must derive NonFungibleData.
// Note that marking top-level fields as `#[mutable]` means that the data under
// that field can be updated by the `resource_manager.update_non_fungible_data(...)` method.
//
// All types referenced directly/indirectly also need to derive ScryptoSbor.
// To work with the Manifest Builder, we recommend all types also derive ManifestSbor.
#[derive(ScryptoSbor, ManifestSbor, NonFungibleData)]
struct GameData {
    team_one: String,
    team_two: String,
    section: String,
    seat_number: u16,
    #[mutable]
    promo: Option<String>,
}

#[blueprint]
mod nftblueprint {
    struct NftBlueprint {}

    impl NftBlueprint {
        pub fn create_game_nfts() {
            ResourceBuilder::new_integer_non_fungible::<GameData>(OwnerRole::None)
                .metadata(metadata! {
                    init {
                        "name" => "Mavs vs Lakers - 12/25/2023", locked;
                        "description" => "Tickets to the 2023 season of the Dallas Mavericks", locked;
                    }
                })
                .create_with_no_initial_supply();
        }
    }
}
```

The ResourceManager

When a resource is created on the Radix network, a "Resource Manager" with a unique address will be associated with it. This manager contains the data of the resource, such as its type (fungible or non-fungible), its metadata, and its supply, among other things. It also allows people with the right authority to perform actions on the resource, like minting more tokens and updating its metadata. On this page, we will show you how to use the ResourceManager type that Scrypto offers and the various methods you can call on it.

Methods available on ResourceManager

The ResourceManager offers many methods, listed in the following tables.

Fetching resource information

| Method | Description |
| --- | --- |
| get_metadata(key) | Returns the metadata associated with the specified key. |
| resource_type() | Returns a ResourceType that is either non-fungible or fungible. |
| total_supply() | Returns the total supply of the resource. |
| non_fungible_exists(NonFungibleLocalId) | Returns whether a token with the specified non-fungible id was minted. |
| get_non_fungible_data(NonFungibleLocalId) | Returns the data associated with a particular non-fungible token of this resource. |

Applying actions

Provided that the correct authority is presented (more information about this here (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/resources/resource-behaviors.md)), you can apply the following actions to a resource.

| Method | Description |
| --- | --- |
| set_metadata(key, value) | Updates a metadata key associated with this resource. |
| mint(amount) | Mints an amount of fungible tokens. Note: the resource must be of the fungible type. |
| mint_non_fungible(&id, data) | Mints a non-fungible token with the specified id and data. Note: the resource must be of the non-fungible type. |
| update_non_fungible_data(&id, field_name, new_data) | Updates a single field of the non-fungible data associated with a minted non-fungible token of the specified ID. |

Updating resource flags

More information on access rules and resource flags here (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/resources/resource-behaviors.md).

| Method | Description |
| --- | --- |
| set_mintable(access_rule) | Sets the access rule for being able to update the mintable flag. |
| lock_mintable() | Locks the mintable flag. |
| set_burnable(access_rule) | Sets the access rule for being able to update the burnable flag. |
| lock_burnable() | Locks the burnable flag. |
| set_withdrawable(access_rule) | Sets the access rule for being able to update the withdrawable flag. |
| lock_withdrawable() | Locks the withdrawable flag. |
| set_depositable(access_rule) | Sets the access rule for being able to update the depositable flag. |
| lock_depositable() | Locks the depositable flag. |
| set_recallable(access_rule) | Sets the access rule for being able to update the recallable flag. |
| lock_recallable() | Locks the recallable flag. |
| set_updatable_metadata(access_rule) | Sets the access rule for being able to update the updateable_metadata flag. |
| lock_updatable_metadata() | Locks the updateable_metadata flag. |
| set_updatable_non_fungible_data(access_rule) | Sets the access rule for being able to update the updateable_non_fungible_data flag. |
| lock_updatable_non_fungible_data() | Locks the updateable_non_fungible_data flag. |

The owner role can be updated with set_owner_role and lock_owner_role.
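The set_*/lock_* pairs above share one pattern: a flag's access rule can be reassigned until the corresponding lock method is called, after which it is frozen permanently. A plain-Rust sketch of that pattern (hypothetical types, not the actual Scrypto API):

```rust
// Plain-Rust sketch of the set_*/lock_* pattern (hypothetical types,
// not the actual Scrypto API): each flag carries an access rule that can
// be reassigned until it is locked; once locked, further changes fail.
#[derive(Debug, Clone, PartialEq)]
enum AccessRule {
    AllowAll,
    DenyAll,
}

struct Flag {
    rule: AccessRule,
    locked: bool,
}

impl Flag {
    // Mirrors set_withdrawable(access_rule): succeeds only while unlocked.
    fn set(&mut self, rule: AccessRule) -> Result<(), &'static str> {
        if self.locked {
            return Err("flag is locked and can never be changed again");
        }
        self.rule = rule;
        Ok(())
    }

    // Mirrors lock_withdrawable(): freezes the current rule permanently.
    fn lock(&mut self) {
        self.locked = true;
    }
}

fn main() {
    let mut withdrawable = Flag { rule: AccessRule::AllowAll, locked: false };
    withdrawable.set(AccessRule::DenyAll).unwrap(); // freeze withdrawals
    withdrawable.set(AccessRule::AllowAll).unwrap(); // unfreeze again
    withdrawable.lock(); // lock in the unfrozen state
    assert!(withdrawable.set(AccessRule::DenyAll).is_err()); // no further changes
    println!("final rule: {:?}", withdrawable.rule);
}
```

This is the same sequence as the freeze/unfreeze/lock example shown earlier on this page, just reduced to the underlying state machine.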
Related Rust docs

- ResourceBuilder (https://docs.rs/scrypto/latest/scrypto/resource/resource_builder/struct.ResourceBuilder.html)
- ResourceManager (https://docs.rs/scrypto/latest/scrypto/resource/resource_manager/struct.ResourceManager.html)
- ResourceManagerStub (https://docs.rs/scrypto/latest/scrypto/resource/resource_manager/struct.ResourceManagerStub.html)

## Resources

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/resources/resources
Updated: 2026-02-18
Summary: Resources (sometimes referred to as tokens or assets) are special and are a crucial part of how Scrypto makes financial applications and transactions safer and

Build > Scrypto > Resources — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/resources/README.md)

Resources (sometimes referred to as tokens or assets) are special and are a crucial part of how Scrypto makes financial applications and transactions safer and more predictable. Resources are native to the Radix Engine, meaning the engine knows how resources are created and how they behave, and can therefore enforce resource behaviors. For example, once resources are created they can only be moved from owner to owner, never copied or unintentionally destroyed or lost. Resources must always be stored in resource containers, either a Vault or a Bucket, and the Radix Engine enforces that no resource can ever be lost. In short, the Radix Engine ensures that resources behave like "physical things", which is why they are used on Radix for all types of assets. Even the utility token of the Radix network, XRD, is a resource.

Types of Resources

Scrypto offers two types of resources that developers can easily build: Fungible and NonFungible resources.

Fungible Resources

A quantity of a fungible resource can be freely split into smaller quantities, and smaller quantities can be recombined. Typical tokens (including the XRD token), where no two tokens have an individual identity, are created as fungible resources.
Example uses of fungible resources include (but are not limited to):

- Utility or governance tokens
- Stablecoins
- Fractionalized shares
- Liquidity provider tokens
- Tokenized representations of commodities

NonFungible Resources

With non-fungible resources, each individual resource unit is uniquely addressable and not divisible. You can think of a non-fungible resource as a grouped set of individual tokens which have the same behavior, but where each unit is a standalone token with its own identity (and its own unique associated data).

Here are some example use cases of non-fungible resources:

- Tickets to an event which have individual seat numbers
- Representations of unique numbered documents or products
- Deeds of ownership of property or other real assets
- Transactable unique debt positions or derivatives

Resource Behaviors

Resources have configurable behavior (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/resources/resource-behaviors.md), which is intrinsically understood by the Radix Engine and clearly communicated to consumers, such as wallets. A developer is able to specify rules around things like who is able to mint more supply (if anyone), whether special rights are required to deposit or withdraw the resource, and so forth. It is also possible to specify which of these rules can be changed after creation, and who is able to change those rules. Please see the Resource Creation (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/resources/resource-creation-in-detail.md) and Resource Behaviors (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/resources/resource-behaviors.md) documentation for an explanation of how to define these rules, and examples of their usage.

Resource Containers

Resources on Radix must always be placed in some kind of resource container. Resource containers, as the name suggests, hold resources.
Each resource container can only hold one type of resource, and the purpose of these resource containers is to properly move, secure, and account for the resources being transacted. There are two primary types of resource containers, Bucket and Vault, with the specific types NonFungibleBucket, FungibleBucket, NonFungibleVault, and FungibleVault accordingly.

- Bucket - Buckets are temporary containers used to move resources within a transaction; therefore, buckets can only exist for the duration of the transaction.
- Vault - Vaults are permanent containers where resources must live. Therefore, any resources moved in buckets must be deposited into a Vault by the end of the transaction.

We will go over Buckets and Vaults (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/resources/buckets-and-vaults.md) in detail in a later section, and you can also see more details of the Bucket and Vault implementations in the Scrypto Rust docs:

- scrypto::blueprints::resource::Bucket (https://docs.rs/scrypto/latest/scrypto/blueprints/resource/struct.Bucket.html)
- scrypto::blueprints::resource::Vault (https://docs.rs/scrypto/latest/scrypto/blueprints/resource/struct.Vault.html)

Scrypto Utilities

Scrypto offers a handful of utilities to conveniently create and manage resources.

- ResourceBuilder - The ResourceBuilder is used to create fungible and non-fungible resources.
- ResourceManager - When a resource is created, a ResourceManager is also created to manage and define resource behaviors.

We will go into more detail about the ResourceBuilder and ResourceManager in the next section, Resource Creation in Detail (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/resources/resource-creation-in-detail.md).
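The "physical thing" guarantee described above maps naturally onto move semantics: a resource can change hands, but it cannot be duplicated. A plain-Rust sketch of the idea (hypothetical types, not the actual Scrypto API):

```rust
// Plain-Rust sketch of move-only resources (hypothetical types, not the
// actual Scrypto API). Bucket deliberately does not derive Clone or Copy,
// so a quantity of tokens can be moved into a Vault but never duplicated.
struct Bucket {
    amount: u64,
}

struct Vault {
    amount: u64,
}

impl Vault {
    fn new() -> Self {
        Vault { amount: 0 }
    }

    // Consumes the bucket by value: after this call the caller no longer
    // owns it, so the same tokens cannot be deposited twice.
    fn put(&mut self, bucket: Bucket) {
        self.amount += bucket.amount;
    }
}

fn main() {
    let bucket = Bucket { amount: 100 };
    let mut vault = Vault::new();
    vault.put(bucket);
    // vault.put(bucket); // would not compile: `bucket` was moved
    println!("vault holds: {}", vault.amount);
}
```

In real Scrypto the same discipline is enforced at the protocol level rather than only by the Rust compiler, which is why a transaction cannot end with resources left dangling in a bucket.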
## Blueprints and Components

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/blueprints-and-components
Updated: 2026-02-18
Summary: Scrypto splits the concept of "smart contract" into two parts: blueprints and components.

Build > Scrypto > Blueprints and Components — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/blueprints-and-components.md)

Scrypto splits the concept of "smart contract" into two parts: blueprints and components. Blueprints define the logic and the type of data your components hold. Components, on the other hand, are live instantiations of your blueprint. Components are therefore closer to how a smart contract is conventionally understood: something users on the network can interact with.

Blueprints in Scrypto are similar to classes in object-oriented programming. Each blueprint contains declarations of the state structure, functions, and methods a component can have. In this way, blueprints can be considered templates for components. A function on the blueprint is expected to perform the instantiation of the blueprint into an active component, typically including configuration parameters for that instance.

As templates, blueprints contain no internal state and can hold no resources. As a result, multiple components can be instantiated from the same blueprint. For example, if we have a blueprint which defines the structure and logic of a liquidity pool, we can instantiate multiple components which facilitate token swaps across many different token pairs. While each component may facilitate different token pairs, all components instantiated from the same blueprint behave the same.

Scrypto code starts its lifecycle as a blueprint package. Each package contains one or more blueprints and is deployed to the network. Once deployed, components can be instantiated from the blueprints, and users can then interact with those components.
A simple example of a blueprint can look like this:

```rust
use scrypto::prelude::*;

#[blueprint]
mod hello {
    struct Hello {
        // Define what resources and data will be managed by Hello components
        sample_vault: Vault,
    }

    impl Hello {
        // Implement the functions and methods which will manage those resources and data

        // This is a function, and can be called directly on the blueprint once deployed
        pub fn instantiate_hello() -> Global<Hello> {
            // Create a new token called "HelloToken," with a fixed supply of 1000, and put that supply into a bucket
            let my_bucket: Bucket = ResourceBuilder::new_fungible(OwnerRole::None)
                .metadata(metadata!(
                    init {
                        "name" => "Hello Token", locked;
                        "symbol" => "HT", locked;
                    }
                ))
                .mint_initial_supply(1000);

            // Instantiate a Hello component, populating its vault with our supply of 1000 HelloToken
            Self {
                sample_vault: Vault::with_bucket(my_bucket),
            }
            .instantiate()
            .prepare_to_globalize(OwnerRole::None)
            .globalize()
        }

        // This is a method, because it needs a reference to self. Methods can only be called on components
        pub fn free_token(&mut self) -> Bucket {
            info!(
                "My balance is: {} HelloToken. Now giving away a token!",
                self.sample_vault.amount()
            );
            // If the semi-colon is omitted on the last line, the last value seen is automatically returned
            // In this case, a bucket containing 1 HelloToken is returned
            self.sample_vault.take(1)
        }
    }
}
```

The use Declaration

The very first part of a blueprint is the use declaration, which imports symbols from the Scrypto standard library and/or other libraries. It creates local name bindings with items defined in an external path. The example scrypto::prelude::* is the list of things that are commonly used in Scrypto. Importing the Scrypto prelude saves you from manually importing everything you need individually.

The #[blueprint] Macro

The #[blueprint] macro allows you to define a blueprint using items defined in Rust grammar. It groups a pair of struct and impl within a mod.
A mod is used to define a module in Rust, and it is a way to organize everything that encapsulates a blueprint in Scrypto. The mod keyword is followed by a name, which can be anything so long as it is a snake_case identifier.

The struct declares all the fields that each component instantiated from the blueprint will have. The struct keyword is followed by the name you would like to give your blueprint. Blueprint names, unlike mod names, use a CamelCase identifier and are used to instantiate components. In this example, our blueprint name is Hello. The fields within the struct can be of any data type supported by SBOR, including but not limited to integers, strings, tuples, vectors, and maps. Additionally, the struct fields can also define the resources the component will hold and/or interact with.

Finally, the impl (implementation) block is where all the functions and methods are defined. It is also followed by the blueprint name specified in the struct.

The .instantiate() Method

The .instantiate() method transforms the Rust value within the blueprint into a component. From there, a component may be globalized, meaning that the component is addressable and can be interacted with by any user on the network. Components that are not globalized can instead be owned by other components and are only callable by the components which own them.

## resim (Radix Engine Simulator)

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/tools-for-scrypto/resim-radix-engine-simulator
Updated: 2026-02-18
Summary: Build fast, reward everyone, and scale without friction

Build > Scrypto > Tools for Scrypto > resim (Radix Engine Simulator) — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/tools-for-scrypto/resim-radix-engine-simulator.md)

resim

resim is your entry-point command for interacting with the Radix Engine Simulator for local development purposes. With resim you can create accounts, publish packages, run transactions, and inspect the local ledger that the simulator creates for this purpose.
Below you will find examples of some common tasks you may use resim for in the development process. To get started, simply open a new terminal and run resim -h. This should give you the following output with a list of available subcommands to begin with.

```text
resim 1.0.0
Build fast, reward everyone, and scale without friction

USAGE:
    resim <SUBCOMMAND>

OPTIONS:
    -h, --help       Print help information
    -V, --version    Print version information

SUBCOMMANDS:
    call-function                Call a function
    call-method                  Call a method
    export-package-definition    Export the definition of a package
    generate-key-pair            Generate a key pair
    help                         Print this message or the help of the given subcommand(s)
    mint                         Mint resource
    new-account                  Create an account
    new-badge-fixed              Create a fungible badge with fixed supply
    new-badge-mutable            Create a fungible badge with mutable supply
    new-simple-badge             Create a non-fungible badge with fixed supply
    new-token-fixed              Create a fungible token with fixed supply
    new-token-mutable            Create a fungible token with mutable supply
    publish                      Publish a package
    reset                        Reset this simulator
    run                          Compiles, signs and runs a transaction manifest
    set-current-epoch            Set the current epoch
    set-current-time             Set the current time
    set-default-account          Set default account
    show                         Show an entity in the ledger state
    show-configs                 Show simulator configurations
    show-ledger                  Show entries in the ledger state
    transfer                     Transfer resource to another account
```

resim Cheat Sheet

| Command | Action |
| --- | --- |
| resim reset | Resets your local ledger data |
| resim new-account | Creates a new account, and sets it as the default account if one is not already set (i.e. you just ran resim reset, so the ledger is empty) |
| resim show <address> | Shows info about an address |
| resim show-ledger | Lists all entities in the simulator |
| resim set-default-account <account_address> <private_key> <owner_badge> | Changes the default account |
| resim generate-key-pair | Generates a new key pair, without creating an account component |
| resim new-token-fixed | Creates a token with fixed supply |
| resim new-token-mutable | Creates a token with mutable supply |
| resim new-badge-fixed | Creates a badge with fixed supply |
| resim new-badge-mutable | Creates a badge with mutable supply |
| resim new-simple-badge | Creates a simple NFT badge |
| resim mint | Mints a resource |
| resim transfer <resource_specifier> <recipient_address> | Transfers a resource to another account |
| resim publish <path> | Publishes a package |
| resim call-function <package_address> <blueprint_name> <function_name> <args> | Calls a function |
| resim call-method <component_address> <method_name> <args> | Calls a method |
| resim export-package-definition [OPTIONS] <package_address> <output_file> | Exports the definition of a package to the output file |
| resim run <path_to_manifest> | Runs a transaction manifest file |
| resim set-current-epoch <epoch> | Sets the current epoch |
| resim set-current-time <time> | Sets the current time (UTC date-time in ISO-8601 format, up to second precision, such as 2011-12-03T10:15:30Z) |
| resim publish --package-address <package_address> <path> | Overwrites a deployed package |

Using resim

The following section demonstrates some common usage examples of the Radix Engine Simulator, resim.

Creating Accounts

Accounts on the Radix network are Scrypto components that hold resource containers and define rules for accessing them. You can instantiate a new account component in the simulator with the resim new-account command. This will give you back the created account's ComponentAddress, public key, private key, and an Owner Badge. If this is the first account that is instantiated, it will be configured as the "default account", from which the transactions you execute will be signed.

```shell
resim new-account
```

`Example Output`

```text
A new account has been created!
Account component address: account_sim1q0l5cmngyap5443zz4qmfylds4tze93rjmpjwt6ev3fq6jyvdh  // #1
Public key: 035f5abaa76e02fdbf313063ac02f7b3e7178dea5a1f2373e69b066d9799280b80  // #2
Private key: dfb5804b5c9d9d4bbde3432b7cd674093355439f1695494f612ee3a75ae165ac  // #3
Owner Badge: resource_sim1qz5p306qw4zs7nc8vf0zrtl3dasv63g8m4kwvk5kx0vqluxmjf:#1#  // #4
Account configuration is complete. Will use the above account as default.
```

- #1 Your account ComponentAddress will be important to specify where resources will be withdrawn from or deposited to.
- #2 Your public key, of course, is your cryptographic identifier.
- #3 Your private key will be used to sign transactions. With resim, this is done automatically, so there are only a handful of situations where you may need to handle public/private keys yourself.
- #4 The Owner Badge is a resource used to establish ownership and administrative controls over a package. With resim you may not often need to deal with the Owner Badge, but it is good to be aware of it as you climb the learning curve of the Radix development stack.

Changing Default Account

At some point when testing your application, you may want to simulate different scenarios. For example, you may want the perspective of a buyer or the perspective of a seller if you were building some sort of marketplace, so it may be useful to have multiple accounts to run transactions through. To do so, you will want to change the default account by running resim set-default-account. Using this command along with its input requirements will look something like this:

```shell
resim set-default-account <account_address> <private_key> <owner_badge>
```

The account ComponentAddress, private key, and Owner Badge NonFungibleGlobalId belong to the account you wish to make the default. So if we create a second account by running resim new-account again, it will produce another set of account information:

```shell
resim new-account
```

`Example Output`

```text
A new account has been created!
```
```text
Account component address: account_sim1q3nq6a5t8hx8znrwkf3r870g8lc2d3364276gsge5drstdwley
Public key: 03bb20507c6f081330f616564186a28068c2c909712f561f20551943a4e0263a67
Private key: 6e472ca0351d883b3583a272e6a9da840a13962606c1971fc280bc0cd45b8601
Owner badge: resource_sim1qfxvtzdvls3u476wxj8t4hfg2temmm463gtzc0szv2estyq2mn:#1#
```

Now that we have a second account, we can set the default account to the one we just created by running resim set-default-account with the argument inputs like so:

```shell
resim set-default-account account_sim1q3nq6a5t8hx8znrwkf3r870g8lc2d3364276gsge5drstdwley 6e472ca0351d883b3583a272e6a9da840a13962606c1971fc280bc0cd45b8601 resource_sim1qfxvtzdvls3u476wxj8t4hfg2temmm463gtzc0szv2estyq2mn:#1#
```

And just like that, the simulator has configured a different account as the default. At any time, you can find the current default account with the command resim show-configs.

Sending Tokens

You can send tokens from the default account to another one by running the command:

```shell
resim transfer [OPTIONS] <resource_specifier> <recipient_address>
```

It takes two parameters: the resource to send (as a RESOURCE_SPECIFIER string) and the recipient's address (which was returned from the new-account command). The RESOURCE_SPECIFIER syntax is discussed further below, but as an example, it looks like this for fungible and non-fungible resources respectively:

```shell
# Transfer 100 XRD from the default account to <recipient_address>
resim transfer resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3:100 <recipient_address>

# Transfer the #12#, #900#, #181# non-fungibles under the given resource address to <recipient_address>
resim transfer resource_sim1ngktvyeenvvqetnqwysevcx5fyvl6hqe36y3rkhdfdn6uzvt5366ha:#12#,#900#,#181# <recipient_address>
```

If everything worked correctly, you should see a success message along with the receipt of the transaction (which we will explain in more detail in another chapter).

Showing Account Balance

Let's verify the balance of the second account. You can query the state of an address with the resim show command. Let's try to run this command with the address of the second account:

```shell
resim show <account_address>
```

This will output the data present at that address. You should see, in the resources section, that this account has 1100 XRD, which shows that the previous command worked perfectly.

You can use the show command with any address. Let's try it with the XRD resource address we used earlier:

```shell
resim show resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3
```

`Example Output`

```text
Resource Type: Fungible { divisibility: 18 }
Metadata: 4
├─ description: The Radix Public Network's native token, used to pay the network's required transaction fees and to secure the network through staking to its validator nodes.
├─ url: https://tokens.radixdlt.com
├─ symbol: XRD
└─ name: Radix
Total Supply: 1000000000000
```

You can see that it outputs useful information about the token: its name, description, and maximum supply, among other things.

Inputting Arguments

An important thing to note if you are new to Scrypto is that Scrypto deals heavily with types, yet when operating with resim, inputs are represented as strings which are then parsed behind the scenes. Therefore, certain types, particularly Bucket and Proof, need to be passed in specific ways. Most argument inputs, such as addresses, Decimal, String, etc., you will only need to input plainly, like so:

```shell
resim show package_sim1p4r4955skdjq9swg8s5jguvcjvyj7tsxct87a9z6sw76cdfd2jg3zk
resim show account_sim1cyvgx33089ukm2pl97pv4max0x40ruvfy4lt60yvya744cve475w0q
resim show component_sim1cptxxxxxxxxxfaucetxxxxxxxxx000527798379xxxxxxxxxhkrefh
resim show resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3
```

Deploying a Blueprint

Deploying a blueprint requires the resim call-function command. The basic argument inputs look like this:

```shell
resim call-function <package_address> <blueprint_name> <function_name> <args>
```

The blueprint name is case-sensitive.

Passing Bucket Arguments using RESOURCE_SPECIFIER

As mentioned, most inputs we need to pass as arguments are simply the value itself in string representation. However, there are special arguments that a function or a method may require, notably buckets and proofs. For these we use a representation called the resim RESOURCE_SPECIFIER.

For example, a GumballMachine component may require you to pass a Bucket of XRD tokens to purchase the gumball resource. To pass a Bucket of a fungible resource, the input takes the form <resource_address>:<amount>. So, if we were to pass a Bucket of 1 XRD to buy 1 gumball, it would look like this:

```shell
resim call-method <component_address> buy_gumball resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3:1
```

Currently, resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3 is the ResourceAddress resim knows for XRD. You can refer to the Well-Known Addresses (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/well-known-addresses.md) page to keep up to date.

Non-fungible resources follow a similar format, but instead we must also pass the NonFungibleLocalId. The string representation of a non-fungible RESOURCE_SPECIFIER is <resource_address>:<non_fungible_local_id>, and additional non-fungible units are separated by commas, like this: <resource_address>:<id_1>,<id_2>,…,<id_n>

Non-fungible ID format

When referencing non-fungible IDs, include their surrounding braces. E.g. an integer non-fungible ID might be #23#, including the # #, and a RUID non-fungible could have the ID {ebf66729b1a12bf8-dd3dfcfda7021600-7dcb9a2540fe42c1-b1a094a879852b19}, including the { }. Without these, resim commands will fail.
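To make the two specifier shapes concrete, here is a small illustrative parser (plain Rust, not resim's actual implementation) that splits a specifier string into its resource address and either an amount or a list of non-fungible local IDs:

```rust
// Illustrative parser for the resim RESOURCE_SPECIFIER formats described
// above (not resim's actual implementation):
//   fungible:      <resource_address>:<amount>        e.g. "resource_sim1...:100"
//   non-fungible:  <resource_address>:#id#,#id#,...   e.g. "resource_sim1...:#12#,#900#"
#[derive(Debug, PartialEq)]
enum Specifier {
    Fungible { address: String, amount: String },
    NonFungible { address: String, ids: Vec<String> },
}

fn parse_specifier(input: &str) -> Option<Specifier> {
    // Split on the first ':' only; the address never contains one.
    let (address, rest) = input.split_once(':')?;
    if rest.starts_with('#') || rest.starts_with('{') {
        // Non-fungible: a comma-separated list of braced local IDs,
        // e.g. "#12#,#900#" or a {…} RUID.
        let ids = rest.split(',').map(|s| s.trim().to_string()).collect();
        Some(Specifier::NonFungible { address: address.to_string(), ids })
    } else {
        // Fungible: everything after the ':' is a decimal amount.
        Some(Specifier::Fungible { address: address.to_string(), amount: rest.to_string() })
    }
}

fn main() {
    println!("{:?}", parse_specifier("resource_sim1tkn:100"));
    println!("{:?}", parse_specifier("resource_sim1qqw:#12#,#900#,#181#"));
}
```

Note how the branch is decided by the first character after the colon: a `#` or `{` signals a braced non-fungible local ID, which is exactly why resim fails if the braces are omitted.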
As an example, say that resource_sim1qqw9095s39kq2vxnzymaecvtpywpkughkcltw4pzd4pse7dvr0 is a non-fungible resource which has a non-fungible id type of NonFungibleIdType::Integer. If we wish to specify non-fungible tokens of this resource with the ids 12, 900, and 181, the string representation of the non-fungible resource specifier would be:

```shell
resource_sim1qqw9095s39kq2vxnzymaecvtpywpkughkcltw4pzd4pse7dvr0:#12#,#900#,#181#
```

Passing Proofs as Arguments

resim allows us to pass proofs in our argument inputs. To do so, we attach a --proofs flag after our arguments, with the proof(s) we want to pass. As an example, let's imagine a component with a permissioned method called mint_admin_badge; we can pass a Proof for it like so:

```shell
resim call-method <component_address> mint_admin_badge 1 --proofs <resource_address>:<amount>
```

We can also pass multiple proofs by separating each Proof with a comma:

```shell
resim call-method <component_address> mint_admin_badge 1 --proofs <resource_address>:<amount>,<resource_address>:<amount>
```

If we need proofs of non-fungibles, we put their non-fungible IDs rather than the amount:

```shell
resim call-method <component_address> mint_admin_badge 1 --proofs <resource_address>:<non_fungible_local_id>,<resource_address>:<non_fungible_local_id>
```

Other Input Types

resim only supports a handful of types. For other types, such as HashMap, Vector, Enum, etc., you will need to write your own transaction manifest. Visit the specifications page to view examples of how you may pass other types with a transaction manifest.

Outputting and Running Transaction Manifest Files

When we run commands with resim, under the hood resim is actually generating and submitting transaction manifest files. This is because every transaction on Radix contains a transaction manifest with a list of instruction intents. To output transaction manifest files, we simply need to use the --manifest flag, along with the path where we want to save the transaction manifest file and its name.
To use the GumballMachine example from before, outputting a transaction manifest file via resim looks like this:

```shell
resim call-method <component_address> buy_gumball resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3:1 --manifest <file_name>.rtm
```

Once the transaction manifest appears in the directory of your choice, you may open the .rtm file and customize the instructions you want to run in your transaction. When satisfied, you may want to submit the transaction manifest file instead of submitting instructions via resim. To do so, we use the resim run command, which looks like this with its input requirements:

```shell
resim run <file_name>.rtm
```

## Scrypto Builder

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/tools-for-scrypto/scrypto-builder
Updated: 2026-02-18
Summary: DOCKER_DEFAULT_PLATFORM=linux/amd64 docker pull radixdlt/scrypto-builder:v1.2.0

Build > Scrypto > Tools for Scrypto > Scrypto Builder — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/tools-for-scrypto/scrypto-builder.md)

scrypto-builder is a tool for compiling Scrypto projects in a deterministic way. It allows third parties to verify that package WASM and RPD files are indeed compiled from a specific source code.

Usage

- Pull the scrypto-builder Docker image

```shell
# You can find all releases on https://github.com/radixdlt/radixdlt-scrypto/releases
DOCKER_DEFAULT_PLATFORM=linux/amd64 docker pull radixdlt/scrypto-builder:v1.2.0
```

- Compile the Scrypto project

```shell
DOCKER_DEFAULT_PLATFORM=linux/amd64 docker run -v /full/path/to/scrypto/crate:/src radixdlt/scrypto-builder:v1.2.0
```

## Scrypto CLI Tool

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/tools-for-scrypto/scrypto-cli-tool
Updated: 2026-02-18
Summary: The scrypto tool is a Scrypto-specific convenience wrapper for Rust's package manager, Cargo.
Build > Scrypto > Tools for Scrypto > Scrypto CLI Tool — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/tools-for-scrypto/scrypto-cli-tool.md)

Scrypto CLI

The scrypto tool is a Scrypto-specific convenience wrapper for Rust's package manager, Cargo.

```text
scrypto 1.0.0
Create, build and test Scrypto code

USAGE:
    scrypto <SUBCOMMAND>

OPTIONS:
    -h, --help       Print help information
    -V, --version    Print version information

SUBCOMMANDS:
    build          Build a Scrypto package
    fmt            Format a Scrypto package
    help           Print this message or the help of the given subcommand(s)
    new-package    Create a Scrypto package
    test           Run Scrypto tests
```

Cheat Sheet

| Command | Action |
| --- | --- |
| scrypto new-package <package_name> | Creates a new package |
| scrypto build | Builds a package |
| scrypto test | Tests a package |
| scrypto test -- --nocapture | Runs tests with standard output shown |

## Tools for Scrypto

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/tools-for-scrypto/tools-for-scrypto
Updated: 2026-02-18
Summary: Legacy documentation: Tools for Scrypto

Build > Scrypto > Tools for Scrypto — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/tools-for-scrypto/README.md)

## Scrypto

URL: https://radix.wiki/developers/legacy-docs/build/scrypto-1/scrypto-1
Updated: 2026-02-18
Summary: Legacy documentation: Scrypto

Build > Scrypto — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/README.md)

## Useful Links

URL: https://radix.wiki/developers/legacy-docs/build/build-dapps/useful-links
Updated: 2026-02-18
Summary: Legacy documentation: Useful Links

Build > dApp Development > Useful Links — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/useful-links.md)

Rust Crate Docs

Scrypto Rust Crate (https://docs.rs/scrypto/latest/scrypto/index.html)
Scrypto & Radix Engine Repository (https://github.com/radixdlt/radixdlt-scrypto)

Radix API Docs

Gateway API: ReDocly Documentation (https://radix-babylon-gateway-api.redoc.ly/)
Core API:
ReDocly Documentation (https://radix-babylon-core-api.redoc.ly/)
- System API: ReDocly Documentation (https://radix-babylon-system-api.redoc.ly/)

NPM Packages

- Radix dApp Toolkit (https://www.npmjs.com/package/@radixdlt/radix-dapp-toolkit)
- ROLA - Radix Off Ledger Authentication (https://www.npmjs.com/package/@radixdlt/rola)
- TS Radix Engine Toolkit (https://www.npmjs.com/package/@radixdlt/radix-engine-toolkit)
- Gateway API SDK (https://www.npmjs.com/package/@radixdlt/babylon-gateway-api-sdk)
- Core API SDK (https://www.npmjs.com/package/@radixdlt/babylon-core-api-sdk)

Wallet and Connector Extension

- Radix Wallet Connector (chrome) (https://chrome.google.com/webstore/detail/radix-wallet-connector/bfeplaecgkoeckiidkgkmlllfbaeplgm)
- Radix Wallet (/contents/tech/core-protocols/radix-wallet)

Dashboard, Dev Console, Gumball Club, etc.

- Stokenet Dashboard (https://stokenet-dashboard.radixdlt.com/network-staking)
- Stokenet Dev Console (https://stokenet-console.radixdlt.com/)
- Stokenet Gumball Club (https://gumball-club-dev.rdx-works-main.extratools.works/)

Examples on GitHub

- Official Examples (https://github.com/radixdlt/official-examples)

## Productionize Your Code URL: https://radix.wiki/developers/legacy-docs/build/build-dapps/before-you-release/productionize-your-code Updated: 2026-02-18 Summary: All packages, components, and resources of an application should have an Owner role configured. The Owner role will be extremely important when new features are added… Build > dApp Development > Before You Release! > Productionize Your Code — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/before-you-release/productionize-your-code.md) Ensure That All Entities Have an Owner All packages, components, and resources of an application should have an Owner role configured. The Owner role will be extremely important when new features are added to Scrypto and the Radix Engine.
As an example, when the upgradeability feature is released, it will be the Owner role that can upgrade components and packages. This is not limited to upgradeability; some new features added to packages, components, and resources will require the Owner role. Additionally, the owner role is used as a default for some of the role assignments, such as that of the metadata setter. The following are some tips for the owner role:

- If a single badge is used as the owner role then it is important that this badge is accessible and that proofs can be generated from this badge and used in transactions. Thus, this badge should be stored in an account (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/account.md) or an access controller (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/access-controller.md). Storing this badge in a component with no way of creating proofs of the badge makes it essentially unusable for arbitrary transactions.
- Blueprint instantiation functions should take an OwnerRole as an argument and apply it recursively to everything that they create. As an example, the owner role passed should be the owner role of all created resources and instantiated components.

There are three variants of the OwnerRole:

- OwnerRole::None: This specifies that there is no owner role.
- OwnerRole::Fixed: This specifies an owner AccessRule that is fixed and not updatable. This is optimal for applications where it is guaranteed that there is no need for the owner rule to be updated.
- OwnerRole::Updatable: This specifies an owner AccessRule that is not fixed and can be updated by the owner itself. This is optimal for applications that wish to have a way of changing the owner AccessRule later on. This could allow applications to start with a simple owner rule and then make it more complex as the need arises.
Example The following is an example of a pair blueprint in a decentralized exchange. This example shows an instantiation function which takes an OwnerRole and correctly propagates this owner role to all of the resources it creates and components that it instantiates. The result is a set of entities that all belong to the same owner.

```rust
use scrypto::prelude::*;

#[blueprint]
mod pair {
    pub struct Pair {
        pub pool: Global<TwoResourcePool>,
        pub pool_manager: FungibleVault,
    }

    impl Pair {
        pub fn instantiate(
            owner_role: OwnerRole,
            resource_addresses: (ResourceAddress, ResourceAddress),
        ) -> (Global<Pair>, FungibleBucket) {
            // ✅ The owner role is propagated to all of the resources created
            // in the instantiation function that should belong to the same
            // owner.
            let pool_manager_badge = ResourceBuilder::new_fungible(owner_role.clone())
                .divisibility(DIVISIBILITY_NONE)
                .mint_initial_supply(1);
            let admin_badge = ResourceBuilder::new_fungible(owner_role.clone())
                .divisibility(DIVISIBILITY_NONE)
                .mint_initial_supply(1);

            // ✅ The owner role is propagated to all of the components created
            // in the instantiation function.
            let pool = Blueprint::<TwoResourcePool>::instantiate(
                owner_role.clone(),
                rule!(require(pool_manager_badge.resource_address())),
                resource_addresses,
                None,
            );

            // ✅ The owner role is used as the owner of the Pair being
            // instantiated.
            let pair = Self {
                pool,
                pool_manager: FungibleVault::with_bucket(pool_manager_badge),
            }
            .instantiate()
            .prepare_to_globalize(owner_role)
            .globalize();

            (pair, admin_badge)
        }
    }
}
```

Ensure That The Metadata of Entities is Correctly Configured Adherence to The Metadata Standards The Radix wallet and dashboard show various information to the user based on the metadata of packages, components, and resources. Currently, this is information such as the name, description, icon, and so on. However, it is not restricted to just that; the metadata is also used for two-way linking and could be used to construct a tree of related dApps and entities.
Failure to set up the metadata correctly could result in two-way linking not being established correctly, the application not looking as it should in transactions, and various entities not looking as they should on the Radix dashboard and wallet. Additionally, misconfigured two-way links between the dApp definition and the dApp website could even result in transactions being rejected by the Radix wallet (if it is not operating in developer mode). Thus, it is important for application developers to adhere to the metadata standards used by the Radix wallet and dashboard and to ensure that all entities within an application adhere to them. The primary standards and guides to follow are:

- Metadata for Wallet Display (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/metadata-standards/metadata-for-wallet-display.md)
- Metadata for Verification (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/metadata-standards/metadata-for-verification.md)
- dApp Definition Setup (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/dapp-definition-setup.md)

Updatable Metadata The metadata standard used by the wallet will continue to change and evolve as the Radix wallet evolves and as developers’ needs arise. Thus, the metadata of the various entities of an application must be updatable to allow application developers to adapt to such changes in the metadata standards. Failure to make the metadata updatable would mean that the application resources, components, and packages might not look right in the Radix wallet and dashboard. Ensure That Entities and Transactions Look Good in The Wallet And Dashboard Do The On-Ledger Entities Look Correct In The Dashboard? - Do all of the on-ledger entities have correctly configured two-way linking? Do The Resources Look Correct In The Wallet and Dashboard? - The Radix wallet and dashboard show the resource behavior (such as whether it can be minted, burned, etc…).
Does the resource behavior seen there align with how this resource is expected to behave? Does it align with the rules it was configured with when it was created in the application blueprint or manifest? - Is the metadata displayed for the resource on the Radix wallet and dashboard as expected? Are the name, symbol, description, tags, icon URL, and related dApps showing up correctly? Do The Manifests Look Correct In The Wallet? - Does the wallet show the application manifests as being conforming or non-conforming? Conforming manifests are shown in the user-friendly interface that shows asset withdrawals and deposits, presented badges, and related dApps, whereas non-conforming manifests are displayed as raw manifest strings. If the manifests are shown as non-conforming, they should adhere to the rules defined in the Conforming Transaction Manifest Types (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/manifest/conforming-manifest-types.md) document. Not all of the transaction types in that document are currently supported; only a small number are. However, it serves as guidance for developers on how to write manifests that are recognized by the current and future versions of the wallet. The rules surrounding manifest types will continue to improve and the detection will get better. Eventually, transaction type detection will be in a place where users would very seldom encounter non-conforming manifests, and most users would refuse to sign them. Thus, application developers should aim to make their manifests conform to the wallet rules so that they are displayed well to users. - Does the dApp information show up in the “using dApps” section of the transaction preview screen? If it does not, this could be because of a misconfigured dApp definition.
Application developers should aim to have their dApps displayed correctly in the “using dApps” section of the transaction preview screen, as future versions of the wallet will warn users when a new dApp is encountered there or when a dApp appears to be impersonating another dApp’s name. - Are deposits of a known amount shown as deposits of a guaranteed amount, or does the wallet prompt the user to enter guarantees? If it is the latter despite the deposit being of a known amount that can’t change at manifest runtime (e.g., transferring X resources between accounts A and B), this might be due to the use of instructions like TAKE_ALL_FROM_WORKTOP or the use of Expression("ENTIRE_WORKTOP"). As a general rule of thumb, always use instructions with explicit amounts and IDs unless your application does not allow for it. This includes instructions such as TAKE_FROM_WORKTOP, TAKE_NON_FUNGIBLES_FROM_WORKTOP, and methods such as try_deposit_or_abort and try_deposit_or_refund. Do Non-Fungible Tokens Look Correct In The Wallet? - Is all of the data expected to be shown to users shown to them in the Radix wallet? The wallet does not show all of the non-fungible data; it only shows fields of simple data types like strings, numbers, booleans, and so on, but cannot show more complex fields like arrays of addresses. Application developers should make sure that the fields they expect users would want to see are displayed in the wallet. Note Display of non-fungible data is not currently implemented in the wallet. Publish a Package Built Through the Deterministic Builder The scrypto build command does not produce deterministic builds. This means that building the same package on different operating systems and processor architectures would not produce identical WASM files.
If it is desirable to be able to verify published packages against their source code, then the deterministic Scrypto builder should be used; more information can be found here (https://docs-babylon.radixdlt.com/main/scrypto/release_notes/migrating_from_0.12_to_1.0.html#_scrypto_builder_docker_image). The version of the builder used should be the same as that of the Scrypto dependency. That is, version X.Y.Z of the builder should be used if the package's Scrypto dependency is as follows:

```toml
[dependencies]
scrypto = { git = "https://github.com/radixdlt/radixdlt-scrypto", tag = "vX.Y.Z" }
```

To build a package through the deterministic builder run the following commands:

```shell
$ DOCKER_DEFAULT_PLATFORM=linux/amd64 docker pull radixdlt/scrypto-builder:v1.0.1
$ DOCKER_DEFAULT_PLATFORM=linux/amd64 docker run -v <path-to-scrypto-crate>:/src radixdlt/scrypto-builder:v1.0.1
```

Note v1.0.1 in the above commands should be replaced with the version of the Scrypto dependency of the package being built. Each release of Scrypto has an associated version of the deterministic builder. Optimize The Size of Packages for Lower Fees There are various ways of optimizing the fees for the users of an application. Perhaps the biggest and simplest of these is making the package WASM smaller. This makes the package cheaper to load into the memory of the Radix Engine and cheaper to invoke methods and functions on. There are many different ways to optimize the package size and some are easier than others. The following is a list of low-hanging fruit for size (and thus fee) optimization. Note This section does not mention anything about the use of wasm-opt since it is now integrated into scrypto build as well as the deterministic Scrypto builder. It is no longer a separate step or command. This is available in v0.12.0 and higher of the Scrypto toolchain. To check which version of the Scrypto toolchain is installed, run scrypto --version.
Additionally, this section does not mention the use of the --release flag since scrypto build only produces release builds. Register Key-Value Store and Non-Fungible Data Types Without the use of registered types, WASM packages need to have the ability to generate the SBOR schema of the types used in key-value stores and non-fungible data during runtime. However, the machinery for deriving SBOR schemas is quite heavy and increases the size of WASM modules by a non-trivial amount by adding a lot of code to them. Thus, pushing the requirement of schema generation from runtime to compile-time removes the need for the WASM modules to have the machinery for deriving SBOR schemas and reduces the size of the WASM modules. The registration of types allows for their schemas to be generated at compile-time and included in the Radix Package Definition (.rpd) file instead of being generated at runtime. Example The following example shows two blueprints: one that does not use registered types and one that does. The WASM produced from the former is likely to be larger than that of the latter.

```rust
use scrypto::prelude::*;

#[derive(ScryptoSbor, NonFungibleData)]
pub struct Card {
    pub name: String,
}

// ⛔️ Incorrectly Configured
mod misconfigured {
    use super::*;

    #[blueprint]
    mod blueprint {
        struct Blueprint {
            key_value_store: KeyValueStore<u32, Card>,
        }

        impl Blueprint {
            pub fn instantiate() -> Global<Blueprint> {
                // ⛔️ Creates a non-fungible resource without using registered
                // types. The schema for this will be derived at run-time which
                // adds the schema derivation machinery into the WASM.
                let _non_fungible_resource =
                    ResourceBuilder::new_ruid_non_fungible::<Card>(OwnerRole::None)
                        .create_with_no_initial_supply();

                // ⛔️ Creates a key-value store without using registered types.
                // The schema for this will be derived at run-time which adds
                // the schema derivation machinery into the WASM.
                let key_value_store = KeyValueStore::<u32, Card>::new();

                Self { key_value_store }
                    .instantiate()
                    .prepare_to_globalize(OwnerRole::None)
                    .globalize()
            }
        }
    }
}

// ✅ Correctly Configured
mod correctly_configured {
    use super::*;

    #[blueprint]
    // ✅ The types used in the KeyValueStore and the non-fungible resource are
    // both registered.
    #[types(Card, u32)]
    mod blueprint {
        struct Blueprint {
            key_value_store: KeyValueStore<u32, Card>,
        }

        impl Blueprint {
            pub fn instantiate() -> Global<Blueprint> {
                // ✅ Creates a non-fungible resource with a registered type.
                let _non_fungible_resource =
                    ResourceBuilder::new_ruid_non_fungible_with_registered_type::<Card>(
                        OwnerRole::None,
                    )
                    .create_with_no_initial_supply();

                // ✅ Creates a KVStore with registered types.
                let key_value_store = KeyValueStore::<u32, Card>::new_with_registered_type();

                Self { key_value_store }
                    .instantiate()
                    .prepare_to_globalize(OwnerRole::None)
                    .globalize()
            }
        }
    }
}
```

Unit type To add the unit type (()) to the #[types()] attribute, define it with type Unit = (). e.g.

```rust
type Unit = ();

#[blueprint]
#[types(Card, u32, Unit)]
```

Use a Release Profile Optimized for Smaller Size The release profile used for builds can have a large effect on the size of the produced WASM. The following release profile has been shown to provide consistently good results (especially when combined with wasm-opt):

```toml
[profile.release]
opt-level = 'z'
lto = true
codegen-units = 1
panic = 'abort'
strip = true
overflow-checks = true
```

## Code Hardening URL: https://radix.wiki/developers/legacy-docs/build/build-dapps/before-you-release/code-hardening Updated: 2026-02-18 Summary: All methods and functions that take Proofs by intent must check the resource address of the passed proof before using any of the data on the Proof. If such a check is not done… Build > dApp Development > Before You Release!
> Code Hardening — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/before-you-release/code-hardening.md) Check The Resource Address of All Proofs Passed by Intent All methods and functions that take Proofs by intent must check the resource address of the passed proof before using any of the data on the Proof. If such a check is not done, the application risks being deceived by a Proof passed by an attacker which has the data, non-fungible ID, amount, and other information that the application expects, but of a different resource. Example The following is an example of this bug in action. The blueprint shown below is a claim handler blueprint. Each user is given a badge that gives them the right to claim their resources. The component stores the claims in a KeyValueStore which maps the non-fungible local ID of the claimer to a vault containing their resources. When a claimer requests to claim their resources they pass a Proof of their user badge and the component returns everything in their claim vault back to them. In the misconfigured example shown below, the claim method does not check the resource address of the passed Proof; it just gets the non-fungible local ID, fetches the associated Vault, and then returns all of the resources in the vault. In this case, an attacker can create a resource, mint NFTs of that resource with the non-fungible IDs they wish to claim against, and successfully perform the claims, since the resource address of the Proof is never checked. Mitigating this attack is quite simple and made easy by the fact that Proofs do not expose any information about their contents until they are converted into CheckedProofs through methods like check, check_with_message, and skip_checking. In the correctly configured example shown below, the check method is called on the Proof to first validate that it is a user badge; only then does the claim process continue.
```rust
use scrypto::prelude::*;

// ⛔️ Misconfigured
mod misconfigured {
    use super::*;

    #[blueprint]
    mod claim_handler {
        /// Stores the resource claims for users and returns the resources when
        /// the user presents their badge.
        struct ClaimHandler {
            /// Maps the non-fungible local ID of the user's claim badge to the
            /// vault containing the resources that they can claim.
            claims: KeyValueStore<NonFungibleLocalId, Vault>,
        }

        impl ClaimHandler {
            pub fn claim(&mut self, proof: Proof) -> Bucket {
                // ⛔️ The resource address of the passed proof was not checked.
                // Thus, it may not have been an actual user badge; it could
                // very well have been a proof of a non-fungible that an
                // attacker created, of which they can mint the IDs they wish
                // in order to claim assets that do not belong to them.
                let non_fungible_local_id = proof
                    .skip_checking()
                    .as_non_fungible()
                    .non_fungible_local_id();
                self.claims
                    .get_mut(&non_fungible_local_id)
                    .unwrap()
                    .take_all()
            }
        }
    }
}

// ✅ Correctly Configured
mod correctly_configured {
    use super::*;

    #[blueprint]
    mod claim_handler {
        /// Stores the resource claims for users and returns the resources when
        /// the user presents their badge.
        struct ClaimHandler {
            /// Maps the non-fungible local ID of the user's claim badge to the
            /// vault containing the resources that they can claim.
            claims: KeyValueStore<NonFungibleLocalId, Vault>,
            /// The resource manager of the user badge that's minted for users
            /// to allow them to claim their resources later. This is stored for
            /// two reasons:
            /// 1. To mint additional user badges for future claims.
            /// 2. To check the passed badge against when the claim method is
            ///    called.
            user_badge: ResourceManager,
        }

        impl ClaimHandler {
            pub fn claim(&mut self, proof: Proof) -> Bucket {
                let non_fungible_local_id = proof
                    // ✅ This checks that the passed proof is of the user badge
                    // resource address; thus mitigating any chance for an
                    // attacker to provide a proof of a matching local ID but
                    // of a different resource.
                    .check(self.user_badge.address())
                    .as_non_fungible()
                    .non_fungible_local_id();
                self.claims
                    .get_mut(&non_fungible_local_id)
                    .unwrap()
                    .take_all()
            }
        }
    }
}
```

Consider The Implications of Proof Amounts In certain cases, special attention needs to be paid to the use of Proof amounts, especially in the context of applications that rely on such amounts for the casting of votes. Multiple proofs can be created of the same underlying assets. Example The following is an example showing an application that is using Proof amounts in a vulnerable way. This is a Decentralized Autonomous Organization (DAO) blueprint where participants in the DAO can cast votes for different proposals. Any entity holding the DAO resources is considered to be a participant and allowed to cast votes of a maximum weight equal to the amount of DAO resources they possess. In the misconfigured example shown below, the vote method takes a Proof of the DAO resources, uses the proof amount to determine the voting weight, and then performs whatever logic is needed to cast a vote of the given weight to the proposal. There are some issues with this: - Voters can vote multiple times since the DAO tokens were not taken away from them. - Voters can create a proof and call the vote method multiple times to inflate their voting power. The attack method shown in the example below is an example of how an attacker might go about exploiting such a vulnerability. Using a single legitimate proof, an attacker can increase their voting weight by simply calling the vote method of the DAO with clones of the legitimate proof as many times as they want. Therefore, applications similar to the DAO shown in this example cannot rely on proof amounts alone; they must also factor in a mechanism to disallow double voting.
The simplest and most straightforward mechanism to do that borrows some ideas from the design of XRD and Liquid Stake Units (LSUs): when casting a vote, users must pass in a bucket of their DAO resources, which is used to determine the vote weight; the DAO keeps the DAO resources and returns to the user some kind of replacement or claim resources that they can use at any point in time to re-cast their vote or claim their DAO resources after the voting period has concluded. Snippets of that are shown in the well-configured example below.

```rust
use scrypto::prelude::*;

#[derive(ScryptoSbor)]
pub enum Vote {
    Yes,
    No,
    Abstain,
}

// 💀 Attacker
mod attacker {
    use super::*;

    #[blueprint]
    mod attacker {
        struct Attacker;

        impl Attacker {
            /// 💀 This function shows how an attacker might exploit this. Using
            /// 1 legitimate proof, they're able to multiply their voting power
            /// by 100 by cloning the proof multiple times.
            pub fn attack(legitimate_proof: Proof) {
                let dao: Global<Dao> = todo!();
                let proof = legitimate_proof.skip_checking();
                for _ in 0..100 {
                    dao.vote(proof.clone().0, 1, Vote::Yes);
                }
            }
        }
    }
}

// ⛔️ Misconfigured
mod misconfigured {
    use super::*;

    #[blueprint]
    mod dao {
        /// A DAO where participants can vote on proposals.
        struct Dao {
            /// The address of the resource used for the voting. This is the
            /// resource users give to the component in order to vote and this
            /// is what determines their voting weight.
            voting_resource: ResourceManager,
        }

        impl Dao {
            /// ⛔️ This method uses proofs for voting which allows an attacker
            /// to inflate a vote in two main ways:
            ///
            /// 1. Clone the proof and call this method again.
            /// 2. Since they still own the resources, they can just create
            ///    another proof from that bucket or vault and pass it again.
            ///
            /// By passing proof of the same resources multiple times they can
            /// inflate the voting results by casting duplicate votes.
            pub fn vote(
                &mut self,
                proof: Proof,
                _proposal_id: u64,
                _vote: Vote,
            ) {
                let _vote_weight =
                    proof.check(self.voting_resource.address()).amount();
                info!("Congratulations, you've voted!");
                /* Remaining of the voting logic goes here. */
            }
        }
    }
}

// ✅ Correctly Configured
mod correctly_configured {
    use super::*;

    #[blueprint]
    mod dao {
        /// A DAO where participants can vote on proposals.
        struct Dao {
            /// The address of the resource used for the voting. This is the
            /// resource users give to the component in order to vote and this
            /// is what determines their voting weight.
            voting_resource: ResourceManager,
            /// The address of the resource given to the voters after they vote
            /// and after their voting resources are taken away from them. This
            /// resource can be exchanged for the voting resource when the vote
            /// period ends.
            voting_claim_resource: ResourceManager,
        }

        impl Dao {
            /// This fixes the vulnerability seen in the misconfigured [`vote`]
            /// by taking a bucket of the vote resources instead of a proof.
            ///
            /// This method does the following:
            ///
            /// * Ensures that the correct bucket of resources was passed in.
            /// * Calculates the voting weight.
            /// * Deposits the voting resources into a vault.
            /// * Mints an equivalent amount of voting claim resources and
            ///   returns them to the caller.
            ///
            /// [`vote`]: [`super::misconfigured::dao::Dao::vote`]
            pub fn vote(
                &mut self,
                bucket: Bucket,
                _proposal_id: u64,
                _vote: Vote,
            ) {
                assert_eq!(
                    bucket.resource_address(),
                    self.voting_resource.address()
                );
                let _vote_weight = bucket.amount();
                info!("Congratulations, you've voted!");
                /* Remaining of the voting logic goes here. */
            }
        }
    }
}
```

Ensure That No Blueprints Suffer From State Explosion Blueprints that use eagerly-loaded, growable data structures of unbounded size such as Vec, HashMap, BTreeMap, IndexMap, HashSet, BTreeSet, IndexSet and other structures should consider switching to lazily-loaded alternatives such as KeyValueStore.
Otherwise, if such growable, eagerly-loaded data structures are used in the component state and continue to grow, then invoking methods on the component may become very expensive or outright impossible due to the cost unit limit. In the worst case, this can make fees high (or impossible to pay) for the users of an application that's vulnerable to state explosion, and can even result in the assets of those users being locked, with no way to unlock them, due to the inability to invoke methods on the component. When a component uses a growable eagerly-loaded data structure such as HashMap, all entries in this structure are loaded whenever the component state is read during a method call. If the application code keeps adding entries, then more and more data will be read with each method invocation on the component. Reading state is costed; thus, the size of such structures can become large enough that invoking methods on the component becomes either too expensive or outright impossible due to the cost unit limit. This can be mitigated by using growable lazily-loaded data structures such as KeyValueStore, where none of the entries are read when the component state is read; only the NodeId of the KeyValueStore is read. State explosion is not something to worry about in the following cases: - When non-growable eagerly-loaded data structures such as [T; N] are used. - When growable eagerly-loaded data structures are used but guaranteed by the application code to never grow. As an example, an IndexSet used in component state which is passed as an argument during component instantiation and guaranteed by the application code to never grow. Lazy sets You can build a lazy set from a KeyValueStore by using an empty/unit value () as follows:

```rust
type LazyMap<K, V> = KeyValueStore<K, V>;
type LazySet<K> = KeyValueStore<K, ()>;
```

Support for iteration Natively, KeyValueStore does not support iteration on-ledger.
It will soon be possible to iterate over its content off-ledger, through the Gateway or upcoming State APIs. If you wish to support on-ledger iteration, you will need to make use of an iterable abstraction on top of KeyValueStore. The community has created some abstractions you can use in your components - see the disclaimer and list below. Please be aware of the following caveats: - The transaction is subject to a limit on the number of substates read and the number of substates written. If the store grows, you should only plan to read from a small (hopefully bounded) part of the whole store in a single transaction. - Note that there is typically some overhead (in terms of the substate read limit and fees) to build / enable iteration on-ledger. Community Libraries This list is not vetted. Inclusion in this list does not imply endorsement. Please look at the individual project / code for more details, to see if it fits your needs. If you know of a relevant library and would like it included here, please get in touch on Discord or at hello@radixdlt.com (mailto:hello@radixdlt.com). - Ociswap’s AVL Tree (https://github.com/ociswap/scrypto-avltree) ( audit report (https://hacken.io/audits/ociswap/) ) State explosion example Example The following is an example showing a blueprint vulnerable to state explosion. This blueprint is of a decentralized exchange where the liquidity is stored in a mapping of the pair resource addresses to the pair vaults. In the misconfigured example an IndexMap is used. As more pairs are created, this map can eventually become large enough that it becomes very expensive to load. This makes swaps very expensive, as each call to the swap method reads the component state, which contains this very large map.
In the worst-case scenario, this map could become large enough that reading it exceeds the cost unit limit, making it impossible to make any calls to the exchange component, essentially locking the component and all of the assets contained within. The correctly configured example mitigates this by replacing the IndexMap with a KeyValueStore.

```rust
use scrypto::prelude::*;

// ⛔️ Misconfigured
mod misconfigured {
    use super::*;

    #[blueprint]
    mod decentralized_exchange {
        /// A decentralized exchange allowing users to swap resources as well as
        /// contribute liquidity.
        struct DecentralizedExchange {
            /// ⛔️ This maps the address of the two resources of the pair to
            /// their respective vaults. There are two main problems with this:
            ///
            /// 1. This map's size is not bounded and can keep growing
            ///    indefinitely.
            /// 2. The map is not lazily loaded; it's loaded in full with each
            ///    method call to the component.
            ///
            /// This means that it is possible for this map to become large
            /// enough that any call to this component becomes either very
            /// expensive or would outright exceed the cost unit limit and
            /// lead such transactions to fail. In the context of this example,
            /// if that happens then users would be unable to swap tokens and
            /// liquidity providers would be unable to redeem their pool units,
            /// which would effectively result in loss of assets!
            pairs: IndexMap<(ResourceAddress, ResourceAddress), (Vault, Vault)>,
        }

        impl DecentralizedExchange {
            /* Implementation of the decentralized exchange */
        }
    }
}

// ✅ Correctly Configured
mod correctly_configured {
    use super::*;

    #[blueprint]
    mod decentralized_exchange {
        /// A decentralized exchange allowing users to swap resources as well as
        /// contribute liquidity.
        struct DecentralizedExchange {
            /// ✅ Key value stores are lazily loaded instead of being eagerly
            /// loaded. This means that none of the [`KeyValueStore`] entries
            /// are loaded when the component state is read, only the [`NodeId`]
            /// of the [`KeyValueStore`] is loaded. Thus, [`KeyValueStore`] can
            /// grow in size indefinitely without leading to state explosion
            /// problems.
            pairs: KeyValueStore<
                (ResourceAddress, ResourceAddress),
                (Vault, Vault),
            >,
        }

        impl DecentralizedExchange {
            /* Implementation of the decentralized exchange */
        }
    }
}
```

Pay Special Attention to Decimal Operations Decimal Overflows Overflows happen when a number cannot be represented through the numeric type's internal representation; for example, Decimal::MAX + Decimal::ONE overflows. The arithmetic operators for Decimal and PreciseDecimal panic when an overflow occurs, which results in a transaction failure. While an overflow just causes a panic in the package WASM (which is caught by the Radix Engine) when the arithmetic operators are used, it is still undesirable for the following reasons: - The overflow could happen for valid but unexpected inputs. Thus, users could have been under the assumption that the blueprint should continue to function as usual with those unanticipated inputs. Depending on the application this might be a small user inconvenience, partial or complete locking of assets, or anything in between. - Certain overflows are completely avoidable by structuring calculations in different ways. - Financial applications are expected to handle overflows and underflows correctly. There are several important questions that blueprint developers need to think about with each calculation involving Decimals and PreciseDecimals: - Can this calculation overflow? - Can this calculation overflow easily? - Can this calculation be expressed in a way that makes it harder for it to overflow? - Can this calculation be expressed in a way that makes it impossible for it to overflow? - If an overflow occurs, how should the blueprint handle it? Should a default value be used in the case of an overflow? Is the input that led to the overflow an invalid input, and should it thus result in a panic?
- Would an overflow in this particular calculation constitute a security risk for the blueprint? Would it perhaps lead to loss or locking of assets in the component?

The answers to the above questions vary from one application to another and from one calculation to another. It is important for blueprint developers to closely examine each calculation in the context of the questions above. Overflows can be combated in two main ways: by using checked-math APIs, which force blueprint developers to think about overflows with each operation, and by performing operations that make the number smaller before operations that make it bigger.

Firstly, Scrypto offers APIs for checked arithmetic (similar to the Rust standard library) that attempt to perform an operation and return Option::Some if it succeeds and Option::None if it fails. This is different from the built-in arithmetic operators, which panic if the operation fails. Developers should use these APIs instead of the regular arithmetic operators, as they push developers to think about overflows and to make a conscious decision about how they wish to handle one: by unwrapping, picking a default, performing other calculations, or by completely different means. As an example, developers should prefer .checked_add over the + operator, which allows them to get an Option back.

Secondly, calculations can be expressed in a way that is mathematically equivalent but which makes overflows harder or impossible. As an example, perform divisions before multiplications to make sure that intermediate values do not become too large and overflow.

Example The following is an example of a mathematical equation that can be coded in two different ways: one where it is impossible to overflow and another where it can easily overflow. This equation calculates how much of a resource an amount of pool units corresponds to.
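In symbols, matching the variable names used in the code that represents it, the redemption amount is the holder's share of the pool unit supply multiplied by the reserves:

$$\text{redemption\_amount} = \frac{\text{pool\_unit\_amount}}{\text{pool\_unit\_total\_supply}} \times \text{reserves}$$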
The two ways of representing the above equation in code are as follows:

```rust
let pool_unit_amount: Decimal = todo!();
let pool_unit_total_supply: Decimal = todo!();
let reserves: Decimal = todo!();

// ⛔️ Easily Overflows!
let redemption_amount = pool_unit_amount * reserves / pool_unit_total_supply;

// ✅ Impossible to Overflow!
let redemption_amount = pool_unit_amount / pool_unit_total_supply * reserves;
```

In the above example, both ways of calculating the redemption amount are mathematically equivalent. However, the first way can overflow quite easily whereas it is impossible for the second way to overflow. Multiplying pool_unit_amount by reserves can overflow if both numbers are large. However, restructuring the calculation so that the pool_unit_amount is first divided by pool_unit_total_supply and then multiplied with the reserves results in a calculation where it is not only hard to overflow but actually impossible. This is because pool_unit_amount can never exceed pool_unit_total_supply, so the division yields a value of at most 1, and multiplying a value of at most 1 by reserves can never produce a result larger than reserves.

Loss of Precision

The previous section recommended doing divisions before multiplications as a way of making overflows harder to trigger. However, this results in a loss of precision and could also result in one part of the equation going to zero. It is recommended that all calculations are done through PreciseDecimal and then converted to Decimal at the end.

Incompatible Divisibility With take Methods

Calling Vault::take or Bucket::take with a number of decimal places incompatible with the divisibility of the resource results in an error and a transaction failure. This commonly happens in applications that perform some kind of arithmetic and then a Vault::take or Bucket::take of the result. The impact that this has varies from application to application; in a decentralized exchange it might mean that swapping tokens does not work in certain cases. Developers should make sure that their blueprints correctly handle resources of various divisibilities and any arithmetic around them with no issues.
This can be mitigated by one of the following approaches:

- Rounding the amount either up or down to the number of decimal places allowed by the resource's divisibility.
- Using the Vault::take_advanced and Bucket::take_advanced methods, which take a rounding mode and will handle the rounding to the resource's divisibility with the specified rounding mode.

Example The following is an example of code that is vulnerable to this issue and how it can be mitigated through take_advanced:

```rust
// Divisibility = 2
let vault: Vault = todo!();

let pool_unit_amount: Decimal = dec!(1);
let pool_unit_total_supply: Decimal = dec!(3);
let reserves: Decimal = dec!(10);

// redemption_amount = 3.333333333333333333
let redemption_amount = pool_unit_amount / pool_unit_total_supply * reserves;

// ⛔️ A withdraw of 3.333333333333333333 is invalid for a resource with a
// divisibility of two. Any decimal with more than 2 decimal places would lead
// this to fail.
let bucket = vault.take(redemption_amount);

// ✅ Correctly handles the divisibility by rounding towards zero.
let bucket = vault.take_advanced(
    redemption_amount,
    WithdrawStrategy::Rounded(RoundingMode::ToZero),
);
```

Consider Upgradeability

Note This section applies until upgradeability is added as a feature of the Radix Engine.

Blueprint upgradeability is not currently supported natively in the Radix Engine. Thus, blueprint developers should consider upgradeability and perhaps implement application-level upgradeability for applications where it makes sense. Most importantly, application developers should have ways of migrating from old code to new code if critical bugs are discovered in their blueprints before upgradeability is implemented natively in the engine. The following is a set of questions that blueprint developers should think about and determine the answer to in the context of their applications:

- How important is it to have a mechanism to upgrade the behavior?
- How should a component be upgraded if bugs are found?
- What happens in the period between when a bug is discovered and a fix is rolled out? What if the discovered bug can drain user funds? Should there be ways of shutting down the component operations?
- How should the latest version of a component be discovered?
- How can it be ensured that external packages and components do not break if the component address changes as part of upgrading?
- How can the assets be moved to the new version?
- How can the state be moved to the new version?
- How can older versions of the code be decommissioned?
- Can the interface of a new version of the component change?

It is important for any non-trivial application to have an answer to the above questions and to have factored application-level upgradeability into its design, so that there is a mechanism for fixing bugs if they arise. There are many different architectures and ways of implementing application-level upgradeability. What makes sense for one application might not make sense for another. An architecture that might make sense for many applications is shown in the diagram below and discussed afterward:

This architecture separates component state completely from component logic. The entirety of the state and vaults is stored in one component while another logic component operates over the state. Only one logic component can read or write state at any point in time, which is enforced by requiring a badge for each state read or write. This is the mechanism by which old versions of the code are deprecated and decommissioned; they lose access to the badge and thus lose the ability to read and write state and to service method calls. Similarly, new versions of the logic are given authority to read and write state by being given the badge. A proxy component can be used to ensure that the component address of the application remains the same across versions. When the logic of the application is updated, the proxy component's state is updated with the component address of the new logic component.
Thus, all method calls to the proxy component now forward the method invocations to the new version of the code.

Ensure That Protected Methods and Functions Are Actually Protected

Misconfigured roles could allow attackers to call methods or perform operations that they should not have access to. Before releasing, the method-to-role mapping and the role-to-rule mapping should both be checked to ensure that no protected methods are accidentally left public.

Example The following is an example showing a blueprint whose method auth is misconfigured in a few different ways. The final blueprint in the example shows a properly configured blueprint. This is a payment splitter blueprint that allows deposits to be split among a group of entities according to percentages defined in the component. Payment splitters have admins that can change the percentage share of the participants. Only the payment splitter manager is allowed to mint additional admin badges by calling the mint_admin_badge method on payment splitter components. It is important in this example that this method is only callable by the payment splitter manager. Otherwise, anybody can mint an admin badge and assume admin powers.

In the first misconfigured example, the enable_method_auth! macro is not used; thus, method auth is completely disabled, and all methods with a visibility of pub in the blueprint are public and callable by all. As a result, mint_admin_badge can be called by anyone. The second misconfigured example has enabled method auth and defined that the blueprint has a manager role. However, the mint_admin_badge method is not assigned to the manager role; it is incorrectly assigned to the PUBLIC role, making it callable by all.
In the third misconfigured example, method auth is correctly enabled, a manager role is correctly defined, and the mapping of the mint_admin_badge method to the manager role is correctly done. However, the manager role is assigned a rule of rule!(allow_all), which makes mint_admin_badge callable by all. Finally, the last blueprint is correctly configured.

Note Exhaustiveness is enforced by the roles! and enable_method_auth! macros. Thus, there is no chance of not assigning a method to a role or not assigning a role to a rule; that is a hard compile-time error.

```rust
use scrypto::prelude::*;

// ⛔️ Misconfigured
mod misconfigured1 {
    use super::*;

    #[blueprint]
    mod payment_splitter {
        /// Allows for deposited resources to be split across multiple different
        /// parties according to the percentages defined by the component.
        struct PaymentSplitter;

        impl PaymentSplitter {
            /// ⛔️ Method auth is not enabled. Thus, this function is public
            /// and callable by all!
            pub fn mint_admin_badge(&mut self) -> Bucket {
                todo!()
            }
        }
    }
}

// ⛔️ Misconfigured
mod misconfigured2 {
    use super::*;

    #[blueprint]
    mod payment_splitter {
        enable_method_auth! {
            roles {
                manager => updatable_by: [];
            },
            methods {
                // ⛔️ The method is mapped to the public role. Thus, it can be
                // called by all!
                mint_admin_badge => PUBLIC;
            }
        }

        /// Allows for deposited resources to be split across multiple different
        /// parties according to the percentages defined by the component.
        struct PaymentSplitter;

        impl PaymentSplitter {
            pub fn mint_admin_badge(&mut self) -> Bucket {
                todo!()
            }
        }
    }
}

// ⛔️ Misconfigured
mod misconfigured3 {
    use super::*;

    #[blueprint]
    mod payment_splitter {
        enable_method_auth! {
            roles {
                manager => updatable_by: [];
            },
            methods {
                mint_admin_badge => restrict_to: [manager];
            }
        }

        /// Allows for deposited resources to be split across multiple different
        /// parties according to the percentages defined by the component.
        struct PaymentSplitter;

        impl PaymentSplitter {
            pub fn instantiate() -> Global<PaymentSplitter> {
                Self
                    .instantiate()
                    .prepare_to_globalize(OwnerRole::None)
                    // ⛔️ The `mint_admin_badge` method requires the `manager`
                    // role but this role is defined as being AllowAll. Thus,
                    // this method is callable by all!
                    .roles(roles! {
                        manager => rule!(allow_all);
                    })
                    .globalize()
            }

            pub fn mint_admin_badge(&mut self) -> Bucket {
                todo!()
            }
        }
    }
}

// ✅ Correctly Configured
mod correctly_configured {
    use super::*;

    #[blueprint]
    mod payment_splitter {
        enable_method_auth! {
            roles {
                manager => updatable_by: [];
            },
            methods {
                // Only the manager role can call the mint_admin_badge method.
                mint_admin_badge => restrict_to: [manager];
            }
        }

        /// Allows for deposited resources to be split across multiple different
        /// parties according to the percentages defined by the component.
        struct PaymentSplitter;

        impl PaymentSplitter {
            pub fn instantiate(
                manager_rule: AccessRule,
            ) -> Global<PaymentSplitter> {
                Self
                    .instantiate()
                    .prepare_to_globalize(OwnerRole::None)
                    // The manager role is set to the rule passed to the
                    // constructor function; this should not just be an
                    // allow_all.
                    .roles(roles! {
                        manager => manager_rule;
                    })
                    .globalize()
            }

            pub fn mint_admin_badge(&mut self) -> Bucket {
                todo!()
            }
        }
    }
}
```

Note Similar care needs to be given to function auth as well.

## Before You Release!

URL: https://radix.wiki/developers/legacy-docs/build/build-dapps/before-you-release/before-you-release
Updated: 2026-02-18
Summary: The goal of this set of articles is to introduce developers to some of the common bugs, misconfigurations, and oversights they may encounter during the development process

Build > dApp Development > Before You Release!
— View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/before-you-release/README.md)

The goal of this set of articles is to introduce developers to some of the common bugs, misconfigurations, and oversights they may encounter during the development process, and to provide examples of exploits that may arise from them as well as mitigation techniques. As the title of the article suggests, developers should go through this article before releasing their applications, but this is also a good article to read before starting to build an application, as it highlights several areas that may require special consideration during the development process. This article includes topics related to blueprint security, blueprint behavior, low-hanging fruit for fee optimization, and user experience in the wallet. The following are the articles available under this section:

- Code Hardening (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/before-you-release/code-hardening.md)
- Productionize Your Code (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/before-you-release/productionize-your-code.md)

## Non-Fungible Resource Creation

URL: https://radix.wiki/developers/legacy-docs/build/build-dapps/dapp-transactions/transaction-examples/transaction-non-fungible-resource-creation
Updated: 2026-02-18
Summary: It is possible to create non-fungible resources with transactions, but it is quite hard to create them manually - this is because a non-fungible resource includes a Scrypto SBOR Schema for the non-fungible data.

Build > dApp Development > dApp Transactions > Examples > Non-Fungible Resource Creation

— View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-transactions/transaction-examples/transaction-non-fungible-resource-creation.md)

It is possible to create non-fungible resources with transactions, but it is quite hard to create them manually - this is because a non-fungible resource includes a Scrypto SBOR Schema (sbor-schemas)
for the non-fungible data. Instead, it’s better to build the transaction with the Rust Manifest Builder (https://github.com/gguuttss/radix-docs/blob/master/integrate/rust-libraries/rust-manifest-builder.md) , and then submit it using the developer console (developer-console) .

Building the Transfer Manifest with the Rust Manifest Builder

First, set up a Rust/Scrypto project as in the Rust Manifest Builder (https://github.com/gguuttss/radix-docs/blob/master/integrate/rust-libraries/rust-manifest-builder.md) docs. The below test can be run to output a manifest to ./transaction_manifest/create_non_fungible.rtm with the CREATE_NON_FUNGIBLE_RESOURCE_WITH_INITIAL_SUPPLY command. You can tweak lots of the parameters, including renaming/restructuring MyNonFungibleDataType. This code uses .lock_fee_from_faucet() to lock the standard test fee from the system faucet in Stokenet. Learn more about Lock Fees here (../../../../integrate/rust-libraries/rust-manifest-builder.md#:~:text=Lock%20Fee,-lock_fee) .

```rust
use scrypto::prelude::*;
use scrypto_test::{prelude::*, utils::dump_manifest_to_file_system};

// The top level struct must derive NonFungibleData.
// Note that marking top-level fields as `#[mutable]` means that the data under
// that field can be updated by the
// `resource_manager.update_non_fungible_data(...)` method.
//
// All types referenced directly/indirectly also need to derive ScryptoSbor.
// To work with the Manifest Builder, we recommend all types also derive
// ManifestSbor.
#[derive(ScryptoSbor, NonFungibleData, ManifestSbor)]
struct MyNonFungibleDataType {
    pub name: String,
    pub description: String,
    #[mutable]
    pub key_image_url: Url,
}

#[test]
fn create_nf_resource() {
    let network = NetworkDefinition::stokenet();
    let manifest_builder = ManifestBuilder::new()
        .lock_fee_from_faucet()
        .create_non_fungible_resource(
            OwnerRole::None,
            NonFungibleIdType::Integer,
            true,
            NonFungibleResourceRoles::default(),
            metadata!(
                init {
                    "name" => "Example NF", locked;
                }
            ),
            // Change the below Some expression to a None with an explicit type
            // annotation to output CREATE_NON_FUNGIBLE_RESOURCE (without
            // initial supply).
            Some(indexmap! {
                NonFungibleLocalId::integer(1) => MyNonFungibleDataType {
                    name: "hello world".to_owned(),
                    description: "lorem ipsum".to_owned(),
                    key_image_url: Url::of("https://assets-global.website-files.com/618962e5f285fb3c879d82ca/61b8f414d213fd7349b654b9_icon-DEX.svg"),
                },
            }),
        );

    dump_manifest_to_file_system(
        manifest_builder.object_names(),
        &manifest_builder.build(),
        "./transaction_manifest",
        Some("create_non_fungible"),
        &network,
    )
    .err();
}
```

Example manifest output is as follows. Note that because it is automatically generated, it doesn’t use the manifest type aliases (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/manifest/manifest-instructions.md) which are typically used for manual creation.
```
CALL_METHOD
    Address("component_tdx_2_1cptxxxxxxxxxfaucetxxxxxxxxx000527798379xxxxxxxxxyulkzl")
    "lock_fee"
    Decimal("5000")
;
CREATE_NON_FUNGIBLE_RESOURCE_WITH_INITIAL_SUPPLY
    Enum<0u8>()
    Enum<1u8>()
    true
    Enum<0u8>(
        Enum<0u8>(
            Tuple(
                Array(
                    Enum<14u8>(
                        Array(
                            Enum<0u8>(12u8),
                            Enum<0u8>(12u8),
                            Enum<0u8>(198u8)
                        )
                    )
                ),
                Array(
                    Tuple(
                        Enum<1u8>("MyNonFungibleDataType"),
                        Enum<1u8>(
                            Enum<0u8>(
                                Array("name", "description", "key_image_url")
                            )
                        )
                    )
                ),
                Array(
                    Enum<0u8>()
                )
            )
        ),
        Enum<1u8>(0u64),
        Array()
    )
    Map(
        NonFungibleLocalId("#1#") => Tuple(
            Tuple(
                "hello world",
                "lorem ipsum",
                "https://assets-global.website-files.com/618962e5f285fb3c879d82ca/61b8f414d213fd7349b654b9_icon-DEX.svg"
            )
        )
    )
    Tuple(
        Enum<0u8>(),
        Enum<0u8>(),
        Enum<0u8>(),
        Enum<0u8>(),
        Enum<0u8>(),
        Enum<0u8>(),
        Enum<0u8>()
    )
    Tuple(
        Map(
            "name" => Tuple(
                Enum<1u8>(Enum<0u8>("Example NF")),
                true
            )
        ),
        Map()
    )
    Enum<0u8>()
;
```

## Simple Token Transfer

URL: https://radix.wiki/developers/legacy-docs/build/build-dapps/dapp-transactions/transaction-examples/simple-token-transfer
Updated: 2026-02-18
Summary: A very common example of a transaction manifest is transferring tokens from one account to another.

Build > dApp Development > dApp Transactions > Examples > Simple Token Transfer

— View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-transactions/transaction-examples/simple-token-transfer.md)

A very common example of a transaction manifest is transferring tokens from one account to another. It's worth quickly seeing how to build the transaction manually as an example, but in practice:

- Most users will have this kind of transaction built for them by their wallet.
- Most integrators would use a tool like the Radix Engine Toolkit (https://github.com/gguuttss/radix-docs/blob/master/integrate/radix-engine-toolkit/radix-engine-toolkit-usage-guide/radix-engine-toolkit-manifest-builder.md) to build their transfer transactions.
Contents of the transaction

In the manifest, we first need to call lock_fee to pay for the transaction. Accounts are components, so even a token transfer happens by calling methods on the accounts of the sender and recipient. We therefore need to start by calling withdraw on our sender account. We then need to call deposit on the recipient account, but first we need to put the resource we’d like to deposit in a bucket. We can do this by creating a named bucket from the worktop with the TAKE_FROM_WORKTOP command, and then using that named bucket in the following deposit call. And conceptually that’s the manifest!

But when building the transaction, we also need to consider authorization. To do that, we need to look at who can perform the account methods (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/account.md) :

- Unsurprisingly, only the owner of the account can call lock_fee and withdraw, so we need to ensure we provide proof we are the owner in the auth zone (transaction-processor) . For pre-allocated accounts tied to a key pair, as commonly used by integrators, this will take the form of a “virtual signature proof” and mean we need to sign the transaction with the right key. For securified accounts, we will need to provide an owner proof from another component such as an access controller (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/access-controller.md) , which itself will require some combination of signature factors to allow us access.
- Perhaps surprisingly, the owner of the account is also required to call deposit. This is because deposit will always succeed, regardless of the account’s resource deposit configuration (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/account.md) , so it requires the owner to sign off on the transaction.
The deposit method should be used for dApp interactions where the owner will be present at transaction signing time. If you’re building a transaction to transfer resources without the owner being present, you will instead need to use a method starting with try_deposit_*, such as try_deposit_or_abort. For further information, see the detailed description of the account methods (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/account.md) .

Building the Transfer Manifest Manually

We’ll start by using our account’s withdraw method to get 10 XRD from our account, which will return the XRD to the worktop. We’ll take it from the worktop and put it in a bucket, which we’ll then pass to the deposit method on the recipient’s account. The snippets below use examples from the Gumball Machine and Radiswap. If you are following along with the examples on your own, then, with the exception of the known addresses, your local resim will likely produce its own unique addresses, different from those shown in the snippets below.

```
# Lock fees to pay for the transaction
CALL_METHOD
    Address("account_sim1qsw6l5leaj62zd38x8f6qhlue76f9lz0n2s49s3mzu8qczjhm2")
    "lock_fee"
    Decimal("10");

# Withdraw 10 XRD from the account, which goes to the worktop
CALL_METHOD
    Address("account_sim1qsw6l5leaj62zd38x8f6qhlue76f9lz0n2s49s3mzu8qczjhm2")
    "withdraw"
    Address("resource_sim1qyqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqs6d89k")
    Decimal("10");

# Take 10 XRD from the worktop and put it in a bucket
TAKE_FROM_WORKTOP
    Address("resource_sim1qyqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqs6d89k")
    Decimal("10")
    Bucket("xrd");

# Try to deposit the bucket of XRD into the recipient's account. This will
# fail if they've disabled XRD deposits.
CALL_METHOD
    Address("account_sim1qsw6l5leaj62zd38x8f6qhlue76f9lz0n2s49s3mzu8qczjhm2")
    "try_deposit_or_abort"
    Bucket("xrd")
    None; # Don't provide any authorized depositor badge
```

If my account doesn’t actually have 10 XRD in it, then the transaction will fail on that first line.
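The worktop and bucket mechanics behind this manifest can be sketched conceptually in plain Rust. This is a toy model only, not the Radix API: the `Worktop` type and its methods are illustrative assumptions made for this sketch.

```rust
use std::collections::HashMap;

// Toy model of the transaction worktop (illustrative only, not the Radix API).
// Maps a resource address string to an amount held on the worktop.
pub struct Worktop {
    resources: HashMap<String, u64>,
}

impl Worktop {
    pub fn new() -> Self {
        Worktop { resources: HashMap::new() }
    }

    // An account `withdraw` returns resources to the worktop.
    pub fn put(&mut self, resource: &str, amount: u64) {
        *self.resources.entry(resource.to_string()).or_insert(0) += amount;
    }

    // TAKE_FROM_WORKTOP: move an amount off the worktop into a named "bucket".
    // Returns None if the worktop does not hold enough, mirroring how the real
    // transaction would fail at this instruction.
    pub fn take(&mut self, resource: &str, amount: u64) -> Option<(String, u64)> {
        let held = self.resources.get_mut(resource)?;
        if *held < amount {
            return None;
        }
        *held -= amount;
        Some((resource.to_string(), amount))
    }
}

fn main() {
    let xrd = "resource_sim1...xrd"; // placeholder address
    let mut worktop = Worktop::new();

    worktop.put(xrd, 10); // account "withdraw" -> worktop
    let bucket = worktop.take(xrd, 10); // TAKE_FROM_WORKTOP -> Bucket("xrd")
    assert!(bucket.is_some()); // enough XRD was on the worktop

    // Taking more than is present fails, just as the transaction would.
    assert!(worktop.take(xrd, 1).is_none());
    println!("bucket ready to pass to try_deposit_or_abort: {:?}", bucket);
}
```

The bucket returned by `take` is what the following `try_deposit_or_abort` call would consume; taking more than the worktop holds fails, which is why the transaction aborts if the account did not hold enough XRD to withdraw.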
We can make this transaction manifest even simpler by using batch deposit functionality with Expression("ENTIRE_WORKTOP"), which sweeps up everything on the worktop and allows it to be deposited to an account. However, it's best to avoid using the Expression("ENTIRE_WORKTOP") command if you can help it, as the transaction cannot be displayed as clearly in wallets. Please see the guide on conforming manifests (conforming-transaction-manifest-types) for more information.

```
# Lock fees to pay for the transaction
CALL_METHOD
    Address("account_sim1qsw6l5leaj62zd38x8f6qhlue76f9lz0n2s49s3mzu8qczjhm2")
    "lock_fee"
    Decimal("10");

# Withdraw 10 XRD from the account, which goes to the worktop
CALL_METHOD
    Address("account_sim1qsw6l5leaj62zd38x8f6qhlue76f9lz0n2s49s3mzu8qczjhm2")
    "withdraw"
    Address("resource_sim1qyqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqs6d89k")
    Decimal("10");

# Take everything from the worktop (currently 10 XRD) and attempt to deposit it
# into the recipient's account. This will fail if they've disabled XRD deposits.
CALL_METHOD
    Address("account_sim1qsw6l5leaj62zd38x8f6qhlue76f9lz0n2s49s3mzu8qczjhm2")
    "try_deposit_batch_or_abort"
    Expression("ENTIRE_WORKTOP")
    None; # Don't provide any authorized depositor badge
```

## Examples

URL: https://radix.wiki/developers/legacy-docs/build/build-dapps/dapp-transactions/transaction-examples/transaction-examples
Updated: 2026-02-18
Summary: Legacy documentation: Examples

Build > dApp Development > dApp Transactions > Examples

— View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-transactions/transaction-examples/README.md)

## Pre-authorization / Subintent Flow

URL: https://radix.wiki/developers/legacy-docs/build/build-dapps/dapp-transactions/pre-authorizations-and-subintents
Updated: 2026-02-18
Summary: Pre-authorizations (the user-facing name for subintents (../../../reference/transactions/subintents.md)) allow a partial transaction to be authorized by one actor

Build > dApp Development > dApp Transactions > Pre-authorization / Subintent Flow

— View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-transactions/pre-authorizations-and-subintents.md)

Pre-authorizations (the user-facing name for subintents (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/subintents.md) ) allow a partial transaction to be authorized by one actor such that it can then become a part of a larger atomic transaction. That larger transaction can then be submitted to the network by another actor. They are a powerful tool which can be used for a wide range of new use cases. Pre-authorizations launched on mainnet at the Cuttlefish (https://github.com/gguuttss/radix-docs/blob/master/updates/protocol-updates/cuttlefish.md) protocol update in December 2024.

Introduction to Pre-authorization

This video introduces the pre-authorization / preauthorization / preauth concept, and covers other information in this page.
Pre-authorization Flow

The following diagram compares the standard transaction request flow with the pre-authorization request flow covered in this article. In the dApp toolkit these are initiated with sendTransaction (https://github.com/radixdlt/radix-dapp-toolkit/blob/main/packages/dapp-toolkit/README.md#sendtransaction) and sendPreAuthorizationRequest (https://github.com/radixdlt/radix-dapp-toolkit/blob/develop/packages/dapp-toolkit/README.md#preauthorization-requests) respectively.

The pre-authorization flow happens in four parts:

- Pre-authorization request: A dApp front-end uses the dApp toolkit (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/dapp-sdks/dapp-toolkit.md) to send a pre-authorization request (https://github.com/radixdlt/radix-dapp-toolkit/blob/main/packages/dapp-toolkit/README.md#preauthorization-requests) to the wallet which includes a subintent (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/subintents.md) manifest stub, an expiry schedule and an optional message.
- A subintent manifest stub must not include any fee locks - these are included by the transaction intent (transaction-intent) .
- The Radix Wallet currently requires that pre-authorizations do not have children of their own - all interaction with other subintents (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/intent-structure.md) must go through its parent.
- At execution time, a subintent does not start running until it is yielded to from its parent. The parent may pass the subintent buckets immediately, which end up on its starting worktop.
- A subintent manifest must end with a YIELD_TO_PARENT instruction, and may include additional intermediate yields, if it needs to run logic before/after interaction with the rest of the transaction.
- If possible, it's recommended to first withdraw/yield buckets and then deposit buckets at the end.
This allows the most liquidity/flexibility in the rest of the transaction.

- The manifest stub should include ASSERT_... instructions to ensure that the user ends up with at least the resources they expect.
- User review and signing: The wallet shows a pre-authorization review for the user, and if they sign, passes it back to the dApp as a hex-encoded SignedPartialTransaction (which just contains a single signed subintent with no children).
- If the subintent is self-contained (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/subintents.md) , it is shown with a preview-style review, which allows the user to add their own guarantees.
- If the subintent has a "GeneralSubintent" classification (docs/conforming-manifest-types#01-general-subintent) then it displays statically-computable bounds on the user's account withdrawals and deposits. The wallet may add access controller calls to access the provided wallets, but will otherwise not change the manifest. The dApp developer will likely need to tweak the ASSERT_... instructions to ensure the bounds are as the user expects. There is detailed guidance on this under the GeneralSubintent (docs/conforming-manifest-types#01-general-subintent) conforming manifest type.
- Propagation: The dApp front-end is then responsible for relaying the signed partial transaction to a back-end service which can build it into a transaction and submit it.
- Depending on the use case, this may be the dApp's own backend, or it may be a specific external intent matcher (such as Anthic) or a more general subintent aggregator/solver (which doesn't exist as of Cuttlefish launch).
- Note that the subintent aggregation and propagation is out-of-band of the network. The network mempools only operate with transactions.
- Each subintent aggregation service may have its own requirements for the metadata the dApp provides.
Some may request that the manifest stub contains a VERIFY_PARENT instruction to ensure only their aggregator can be used. - Transaction construction and submission: The subintent aggregator finds/creates other intents to work with the subintent(s) it receives, builds up the transaction using a v2 partial transaction builder / transaction builder, previews the resulting transaction, and if happy, notarizes it and submits it to the network. - The v2 (partial) transaction builders (https://github.com/gguuttss/radix-docs/blob/master/integrate/rust-libraries/rust-transaction-building.md) are available in the v1.3.0+ rust radix-transactions crate (https://github.com/gguuttss/radix-docs/blob/master/integrate/rust-libraries/rust-libraries-overview.md) or in a variety of UniFFI Radix Engine Toolkits (https://github.com/gguuttss/radix-docs/blob/master/integrate/radix-engine-toolkit/README.md) . As of December 2024, they are not yet available in the Typescript Radix Engine Toolkit. - A v2 preview transaction can be created from the transaction builder, and previewed with the v2 preview endpoint on the Core API (https://radix-babylon-core-api.redoc.ly/#tag/Transaction/paths/~1transaction~1preview-v2/post) or Gateway API (https://radix-babylon-gateway-api.redoc.ly/#operation/TransactionPreviewV2) . Use Cases Delegated Fees In this use case, a dApp wishes to unconditionally pay fees for a user's interaction with their dApp, so that e.g. the user is not required to have any XRD in their account. The dApp acts as the subintent aggregator, in order to pay fees for a user's subintent: - The dApp sends the user a pre-authorization request with a subintent manifest to interact with their app, much like a transaction request. - To get a typical preview-based review experience where the user can set guarantees, the subintent manifest stub should ideally be self-contained (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/subintents.md) . 
If not, the user will get a static review based on guaranteed deposit amounts. - The dApp backend would then: - Decompile the subintent to verify it is doing exactly what the dApp expects, so the dApp is comfortable paying fees for the user - Wrap it in a transaction intent, where the transaction intent locks a fee from the dApp account - Preview the transaction to ensure it is valid (e.g. that the user has the claimed resources) - Sign/notarize with keys for the dApp account - Submit the transaction to the network User Badge Deposit A dApp wishes to deposit a user badge to a user's account where they may have configured their account to reject deposits of new resources (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/account.md) . In this case, the dApp acts as the subintent aggregator and: - Proposes a subintent manifest stub which: - Receives the dApp badge at the start of execution - Deposits it to their account - Optionally it may also add a VERIFY_PARENT so that the user has confidence that this manifest can only be used by the dApp itself. - YIELDs back to parent to finish. Optionally it can also withdraw a small bucket of XRD and pass it back to the dApp in the YIELD, to cover fee costs. - The dApp front-end then passes the signed partial transaction to their backend, for checking, wrapping, signing and submitting, as per the delegated fee payment use case. Intent-based partial trades In this use case, a dApp wishes to allow a user to take one side of a trade, possibly involving interacting with their dApp. They can then share the subintent with an aggregation network which can hopefully find the other side of the trade for the user. 
- A dApp proposes a subintent which: - Withdraws resources from a user's account - Optionally interacts with a dApp component, and/or VERIFY_PARENT to ensure only certain subintent aggregator/s can be used - Calls YIELD_TO_PARENT with some buckets of resources (including some fee payment) - Deposits explicit returned buckets from the YIELD back to the user's account - This subintent can be passed to an external subintent aggregator network to be paired with other subintents/intents which can "take the other side" of the trade. Anthic example To take Anthic (https://www.anthic.io/) ( primer blog (https://www.radixdlt.com/blog/introducing-anthic-the-first-intent-based-dex-on-radix/) | integration docs (https://docs.anthic.io/integration/dex/example) ) as a particular example: - Any DEX dApp can integrate Anthic. They call to the Trade API (https://docs.anthic.io/api/trade_api/) to get details and the current order book in order to construct a transaction as per the worked example (https://docs.anthic.io/integration/dex/example#compute-limit-order) . - The dApp constructs a subintent manifest stub in a particular Anthic-compatible format (https://docs.anthic.io/concepts/subintents) and sends to the user's wallet. - The dApp then submits the signed partial transaction from the user with some metadata to the Anthic Trade API. - Anthic attempts to find a matching trade with market makers, gets a signed partial transaction from them. It combines these into a complete transaction and submits it to the network. Co-ordinated ticket purchase In this use case, a user wishes to use a trusted dApp to co-ordinate a ticket purchase with a friend, and ensure that they either both end up with a ticket, or neither person ends up with a ticket. In this case, the dApp acts as the subintent aggregator and: - Proposes a separate subintent to each user for their purchase - Combines the resulting signed partial transactions into a single transaction, which must commit atomically. 
- Passes the transaction to its backend for checking, wrapping, signing and submitting as per the delegated fee payment use case. ## Writing Manifests URL: https://radix.wiki/developers/legacy-docs/build/build-dapps/dapp-transactions/writing-manifests Updated: 2026-02-18 Summary: Transaction manifests are human-readable lists of instructions that are followed by the Radix Engine to transfer resources or otherwise change state. Build > dApp Development > dApp Transactions > Writing Manifests — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-transactions/writing-manifests.md) Transaction manifests are human-readable lists of instructions that are followed by the Radix Engine to transfer resources or otherwise change state. See the Transaction Manifest Model (transaction-manifest) and further reading on how to think about the transaction manifest and transaction layer. Concepts Fee payment To run a transaction on the Radix network, you have to pay a fee depending on the number of instructions you are calling and the permanent storage used. You do this by locking XRD from a vault that will then be used to pay for the fee calculated at the end of the transaction. To lock a fee, you must call a method that essentially calls .lock_fee(amount) on a vault containing XRD. There is a method that does this on the Account component: CALL_METHOD Address("[account_component_address]") "lock_fee" Decimal("[amount]"); For testing purposes, on the local simulator, you can also call the lock_fee method on the System component (address component_sim1qftacppvmr9ezmekxqpq58en0nk954x0a7jv2zz0hc7q8utaxr) so the fee payment comes from the system rather than your account. You can read more about fees here (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/costing-and-limits/transaction-costing.md) . The Worktop Each transaction has a worktop, which is a place where resources may be held during transaction execution. 
Resources returned from component calls are automatically put onto the worktop. From there, the manifest may specify that resources on the worktop be put into named buckets so that those buckets may be passed as arguments to other component calls. The manifest may also use ASSERT commands to check the contents of the worktop, causing the entire transaction to fail if not enough of the checked resource is present. This is useful to guarantee the results of a transaction, even if you are unsure of what a component may return. Of course we know that all resources must be in a vault by the end of any transaction, so the transaction manifest creator must ensure that no resources are left on the worktop or in buckets by the end of the manifest’s steps. The Authorization Zone Another key concept is the authorization zone. The authorization zone acts somewhat similarly to the worktop, but is used specifically for authorization. Rather than holding resources, the authorization zone holds proofs. A proof is a special object that proves the possession of a given resource or resources. When a component method or blueprint function is called by the transaction manifest, the proofs currently in the transaction’s authorization zone are automatically used to validate against the authorization rules defined in that method/function’s role assignments. If this check fails, the transaction is aborted. Proofs can enter the auth zone from two places: - Signatures on the transaction are automatically added to the authorization zone as "virtual signature proofs". This, for example, is how you are able to call the withdraw method on a pre-allocated account component which is still set up to require a proof of a signature with its corresponding public key hash. - Proofs can also be returned by calls to methods. They are automatically added to the authorization zone by the transaction processor (transaction-processor) . 
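To make the proof flow concrete, the following is a minimal sketch of a manifest, built as a template string, that creates a proof and then calls a badge-gated method. All addresses and the gated method name are hypothetical placeholders, and create_proof_of_amount is used here on the assumption that it is available on the Account component; check the manifest instruction reference before relying on it.

```typescript
// Sketch only: all addresses below are hypothetical placeholders, not real
// on-ledger entities, and "admin_only_method" is an invented method name.
const account = "account_sim1_hypothetical_account";
const adminBadge = "resource_sim1_hypothetical_admin_badge";
const gated = "component_sim1_hypothetical_gated_component";

// create_proof_of_amount returns a proof; as described above, proofs
// returned by method calls are placed on the authorization zone by the
// transaction processor, which then authorizes the gated call.
const manifest = `
CALL_METHOD Address("${account}") "lock_fee" Decimal("10");
CALL_METHOD Address("${account}") "create_proof_of_amount" Address("${adminBadge}") Decimal("1");
CALL_METHOD Address("${gated}") "admin_only_method";
`;

console.log(manifest.trim().split("\n").length); // 3 instructions
```

Building manifests as template strings like this is the approach the tooling section below recommends for simple front-end use cases.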
For more about proofs and authorization, please see the Authorization Model (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/authorization-model.md) . Tooling There are various tools for building transaction manifests for testing, learning and interacting with your components: - For simulators / learning: - Use the Radix Engine Simulator resim - this allows running transactions to perform actions against a simulated ledger, and also supports outputting example manifests for any actions it can perform. - For crafting more complex manifests: - Use the Rust Manifest Builder (https://github.com/gguuttss/radix-docs/blob/master/integrate/rust-libraries/rust-manifest-builder.md) - this can allow building and outputting arbitrary manifests to a file easily, including complex manifests such as those creating Non Fungible Resources. - Use the Manifest Instructions (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/manifest/manifest-instructions.md) as reference to write manifests manually. - For front-end dApp builders: - Use template strings for simple manifests, or the Typescript Radix Engine Toolkit (https://github.com/radixdlt/radix-dapp-toolkit?tab=readme-ov-file#build-transaction-manifest) for more complex manifests - as per the docs in the Radix dApp Toolkit (https://github.com/radixdlt/radix-dapp-toolkit?tab=readme-ov-file#build-transaction-manifest) . - For integrators looking to programmatically create manifests or transactions: - If writing in Rust, use the Rust Manifest Builder (https://github.com/gguuttss/radix-docs/blob/master/integrate/rust-libraries/rust-manifest-builder.md) . - If writing in other languages, use the Radix Engine Toolkit (https://github.com/gguuttss/radix-docs/blob/master/integrate/radix-engine-toolkit/radix-engine-toolkit-usage-guide/radix-engine-toolkit-manifest-builder.md) . Using Radix Engine Simulator resim The resim CLI supports basic method (or function) invocation transactions. 
Any transaction resim supports can be outputted as a transaction manifest without committing it to the simulated ledger. To use this feature, add the --manifest flag to your resim call-function or resim call-method commands (or any other command that would create a transaction). For example: resim call-function package_sim1qy2f2f63ddpw8x40m55ytpru428vk66edhdpvz60ljqsj5y7v8 Hello instantiate_hello --manifest out.rtm will produce something like: CALL_METHOD Address("account_sim1qsw6l5leaj62zd38x8f6qhlue76f9lz0n2s49s3mzu8qczjhm2") "lock_fee" Decimal("10"); // #1 CALL_FUNCTION Address("package_sim1qy2f2f63ddpw8x40m55ytpru428vk66edhdpvz60ljqsj5y7v8") "Hello" "instantiate_hello"; // #2 CALL_METHOD Address("account_sim1qsw6l5leaj62zd38x8f6qhlue76f9lz0n2s49s3mzu8qczjhm2") "deposit_batch" Expression("ENTIRE_WORKTOP"); // #3 - To pay for the transaction fees, your transaction must contain a call to a method locking fees. In this auto-generated example, the lock_fee method is called on the default simulator account component; in your own transactions you would likewise call the lock_fee method on your account’s component. - This line calls the instantiate_hello function on the Hello blueprint. - This line takes all the resources present on the worktop, puts them in buckets and sends them back into your account. In this case it’s not important since no resources are returned from the instantiate_hello function. You can edit the auto-generated manifest to modify or add any other instructions you like. Using the Rust Manifest Builder See Rust Manifest Builder (https://github.com/gguuttss/radix-docs/blob/master/integrate/rust-libraries/rust-manifest-builder.md) . Writing Transaction Manifests Manually Once you’re familiar enough with the transaction manifest syntax you can of course write manifest instructions from scratch. Please see the full instruction list and specification (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/manifest/manifest-instructions.md) to learn more. 
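When writing manifests by hand, the worktop ASSERT guarantee described earlier is one of the most useful instructions to include. Below is a sketch of a hand-written swap manifest carrying such a guarantee, built as a template string; all addresses, the "swap" method, and the minimum amount are hypothetical placeholders.

```typescript
// Sketch: a hand-written swap manifest with a worktop guarantee.
// All addresses and amounts below are hypothetical placeholders.
const account = "account_sim1_hypothetical_account";
const xrd = "resource_sim1_hypothetical_xrd";
const token = "resource_sim1_hypothetical_token";
const dex = "component_sim1_hypothetical_dex";

const manifest = `
CALL_METHOD Address("${account}") "lock_fee" Decimal("10");
CALL_METHOD Address("${account}") "withdraw" Address("${xrd}") Decimal("100");
TAKE_FROM_WORKTOP Address("${xrd}") Decimal("100") Bucket("payment");
CALL_METHOD Address("${dex}") "swap" Bucket("payment");
ASSERT_WORKTOP_CONTAINS Address("${token}") Decimal("95");
CALL_METHOD Address("${account}") "deposit_batch" Expression("ENTIRE_WORKTOP");
`;

// If the swap returns fewer than 95 tokens to the worktop, the ASSERT
// instruction aborts the entire transaction, so nothing is lost.
console.log(manifest.includes("ASSERT_WORKTOP_CONTAINS")); // true
```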
It is also very common to use instructions that call methods on Native Blueprints (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/README.md) . You will find blueprint methods listed in their specification (e.g. Account (../../../reference/radix-engine/native-blueprints/account.md#blueprint-api-function-reference) ) where you will also find links to their Rust Docs (https://docs.rs/scrypto/1.2.0/scrypto/component/) . Examples "Composed" Multi-Component Transaction Now let’s get a little more advanced by composing together calls to multiple components, using the resources returned from each to feed into the next. We’ll take 1 XRD from our account, call another component to buy a gumball, and then swap that gumball in a Radiswap component for some other resources (we won’t check what we got back; we’ll just deposit whatever it is back into our account). First, we’ll use some placeholders for readability: # lock fees to pay for the transaction CALL_METHOD Address("account_sim1qsw6l5leaj62zd38x8f6qhlue76f9lz0n2s49s3mzu8qczjhm2") "lock_fee" Decimal("10"); # withdraw 1 XRD from account, which goes to the worktop CALL_METHOD Address("account_sim1qsw6l5leaj62zd38x8f6qhlue76f9lz0n2s49s3mzu8qczjhm2") "withdraw" Address("[xrd_address]") Decimal("1"); # take 1 XRD from the worktop and pass it to the gumball machine TAKE_FROM_WORKTOP Address("[xrd_address]") Decimal("1") Bucket("xrd"); CALL_METHOD Address("component_sim1qve43kxmshr40xakn6akj5amzc3walsdyqdgr8ermncq8uvy97") "buy_gumball" Bucket("xrd"); # take all returned gumballs and do a radiswap TAKE_FROM_WORKTOP Address("resource_sim1qye43kxmshr40xakn6akj5amzc3walsdyqdgr8ermncqzz5l6k") Decimal("1") Bucket("gumballs"); CALL_METHOD Address("component_sim1q0d9pmtn6xsrsqkdxlzyjrdnc9n94n9fma3jtrrehymst2rv4k") "swap" Bucket("gumballs"); # deposit everything into my account CALL_METHOD Address("account_sim1qsw6l5leaj62zd38x8f6qhlue76f9lz0n2s49s3mzu8qczjhm2") "deposit_batch" 
Expression("ENTIRE_WORKTOP"); If you want to see the same transaction with some actual addresses from a local run, here you go: # lock fees to pay for the transaction CALL_METHOD Address("account_sim1qsw6l5leaj62zd38x8f6qhlue76f9lz0n2s49s3mzu8qczjhm2") "lock_fee" Decimal("10"); # withdraw 1 XRD from account, which goes to the worktop CALL_METHOD Address("account_sim1qsw6l5leaj62zd38x8f6qhlue76f9lz0n2s49s3mzu8qczjhm2") "withdraw" Address("resource_sim1qyqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqs6d89k") Decimal("1"); # take 1 XRD from the worktop and pass it to the gumball machine TAKE_FROM_WORKTOP Address("resource_sim1qyqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqs6d89k") Decimal("1") Bucket("xrd"); CALL_METHOD Address("component_sim1qve43kxmshr40xakn6akj5amzc3walsdyqdgr8ermncq8uvy97") "buy_gumball" Bucket("xrd"); # take all returned gumballs and do a radiswap TAKE_FROM_WORKTOP Address("resource_sim1qye43kxmshr40xakn6akj5amzc3walsdyqdgr8ermncqzz5l6k") Decimal("1") Bucket("gumballs"); CALL_METHOD Address("component_sim1q0d9pmtn6xsrsqkdxlzyjrdnc9n94n9fma3jtrrehymst2rv4k") "swap" Bucket("gumballs"); # deposit everything into my account CALL_METHOD Address("account_sim1qsw6l5leaj62zd38x8f6qhlue76f9lz0n2s49s3mzu8qczjhm2") "deposit_batch" Expression("ENTIRE_WORKTOP"); ## dApp Transactions URL: https://radix.wiki/developers/legacy-docs/build/build-dapps/dapp-transactions/dapp-transactions Updated: 2026-02-18 Summary: This section focuses on how to think about and construct transactions as a dApp developer. It's best to start with the Writing Manifests(writing-manifests.md) a Build > dApp Development > dApp Transactions — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-transactions/README.md) This section focuses on how to think about and construct transactions as a dApp developer. 
It's best to start with the Writing Manifests (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-transactions/writing-manifests.md) article. A complete description of transactions can be found in the Essentials (https://github.com/gguuttss/radix-docs/blob/master/essentials/README.md) section, Transactions on Radix (https://github.com/gguuttss/radix-docs/blob/master/essentials/transactions-on-radix.md) . For more details on the transactions technical model, please see the Technical Model (technical-model) sections on Transactions. ## dApp Definition Setup URL: https://radix.wiki/developers/legacy-docs/build/build-dapps/dapp-application-stack/dapp-definition-setup Updated: 2026-02-18 Summary: A dApp Definition Account serves two purposes: Build > dApp Development > Application Stack > dApp Definition Setup — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/dapp-definition-setup.md) A dApp Definition Account serves two purposes: - Providing an on-ledger registration to establish relationships between your dApp’s entities (packages, components, resources) (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/metadata-standards/metadata-for-wallet-display.md) - Providing an on-ledger registration for dApp (frontend) verification, to ensure the dApp users are interacting with is authentic. (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/metadata-standards/metadata-for-verification.md) The role of the dApp Definition Account is most visible in the Radix Wallet, where it informs your users how the entities of your dApp are related to each other. Most importantly, it helps authenticate your dApp’s website (and other entities) to ensure your users are not fooled by a fake representation of your dApp. In essence, the dApp Definition Account acts as a hub which connects all the parts of your dApp together. 
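The website side of this verification is a .well-known/radix.json file served by the dApp's site. A minimal sketch of building that payload follows; the "dApps"/"dAppDefinitionAddress" field names are an assumption based on the metadata verification standard, so check that standard before use, and the address is a placeholder.

```typescript
// Sketch of the payload a dApp website might serve at
// /.well-known/radix.json to back a claimed website.
// ASSUMPTION: the "dApps" / "dAppDefinitionAddress" field names follow the
// metadata verification standard; the address is a placeholder.
const radixJson = {
  dApps: [
    { dAppDefinitionAddress: "account_rdx1_hypothetical_dapp_definition" },
  ],
};

const body = JSON.stringify(radixJson, null, 2);
console.log(body.includes("dAppDefinitionAddress")); // true
```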
Setting up a dApp Definition using Dev Console Ideally, a dApp Definition account should be created after you have built your dApp’s components and resources, and created a website front end for it. - Create a new account in the Radix Wallet. This is the account which we will convert to a dApp Definition account. - If you’re just learning, head to the Stokenet Developer Console (https://stokenet-console.radixdlt.com/configure-metadata) . The “Configure Metadata” page provides a simple interface to set the metadata on an account to make it a dApp Definition. If you are ready for production, make sure to set up your dApp Definition on the Mainnet Dev Console (https://console.radixdlt.com/configure-metadata) . - Connect your Radix Wallet to the Dev Console. Share the account you’re about to convert into a dApp Definition account. - Go to the “Configure Metadata” page. - Fill in the account address in the entity search input. You can copy it from the list of accounts you’ve already shared. - Click the “Search” button or hit “Enter”. - You will see an initial form with one “account_type” field and a “Badges” section. Change the value of the “account_type” field to “dapp definition”. - Fill in the name, description, tags and icon_url. These fields are standard display metadata (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/metadata-standards/metadata-for-wallet-display.md) entries for the account entity. The values of these fields will be displayed when your dApp sends requests to the wallet. - In order to set metadata for verification (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/metadata-standards/metadata-for-verification.md) , you need to fill in the claimed_websites and claimed_entities fields. - claimed_websites - Here, you can claim ownership of your dApp’s website(s) for authenticity. This is confirmed by looking up an expected .well-known/radix.json file at the claimed website origin. 
This will be required for your dApp to successfully send requests to the Radix Wallet on Mainnet. - claimed_entities - Here, you can claim ownership of resources, components, and packages. - Click “Send to the Radix Wallet”. - An approval transaction should appear in your Radix Wallet for you to confirm! When you set up the Radix dApp Toolkit (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/dapp-sdks/dapp-toolkit.md) in your dApp frontend website, you’ll configure it with the dApp Definition address that you just created, and it will be sent to the Radix Wallet whenever a user connects or receives a transaction from your dApp. The Wallet will then look up that dApp Definition address on the Radix Network, pull the latest metadata, and show it to the user. When a user logs in to your dApp, an entry for your dApp will also appear in the wallet’s preferences. Claimed Entities You can use the “Standard Metadata” page to link resources or components back to the dApp Definition account. This way you will set up the correct two-way link, which can be verified by the Radix Wallet. In order to do that, paste the address of a component or a resource into the initial entity search input. You’ll see a predefined set of standard metadata fields for the given entity. Fill them in and send the transaction to the Radix Wallet, the same as was done for the dApp Definition account. Badges Sometimes setting metadata on a component or a resource is restricted to people who hold a certain badge. You can use the badge section of the “Configure Metadata” page to prove that you own certain resources. Selecting a fungible token from the list will create a proof that you own at least one. Selecting a non-fungible resource will allow you to present a particular NFT from the collection. 
This way you can change metadata on entities that have more sophisticated rules for updating metadata. ## ROLA - Radix Off Ledger Auth URL: https://radix.wiki/developers/legacy-docs/build/build-dapps/dapp-application-stack/rola-radix-off-ledger-auth Updated: 2026-02-18 Summary: ROLA is a method of authenticating something claimed by the user connected to your dApp with the Radix Wallet. It uses the capabilities of the Radix Network to ma Build > dApp Development > Application Stack > ROLA - Radix Off Ledger Auth — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/rola-radix-off-ledger-auth.md) ROLA is a method of authenticating something claimed by the user connected to your dApp with the Radix Wallet. It uses the capabilities of the Radix Network to make this possible in a way that is decentralized and flexible for the user. ROLA is intended for use in the server backend portion of a Full Stack dApp. It runs "off-ledger" alongside backend business and user management logic, providing reliable authentication of claims of user control using "on-ledger" data from the Radix Network. Learn more about Pure Frontend dApps and Full Stack dApps at What is a dApp on Radix? (https://github.com/gguuttss/radix-docs/blob/master/build/setting-up-for-scrypto-development/setting-up-for-dapp-development.md) Uses of ROLA There are two kinds of authentication ROLA is designed for: - Authenticating a user’s login using a Persona - Authenticating a user’s control of an account on Radix In short, these two cases are solved by ROLA authenticating a cryptographic proof of control of an entity on the Radix Network: - Proof of control of an Identity component - Proof of control of an Account component If a user’s Radix Wallet can produce a proof of control of an Identity component at a given address, and ROLA authenticates it, then that user may safely be considered to be logged in. 
The Identity’s address may be used as the unique identifier for that user. If a user’s Radix Wallet can produce a proof of control of an Account component at a given address, and ROLA authenticates it, then that user may safely be considered to be the owner of that account. They may also be safely considered the owner of the assets contained within the account. How ROLA Works ROLA is somewhat similar to the PassKeys system of FIDO authentication (https://fidoalliance.org/passkeys/) , but leverages the existence of a safe decentralized network to enable cycling of public keys used for authentication, rather than relying on fixed public keys and a cloud backup of the corresponding private key. ROLA works on the expectation that all Identity and Account components on the Radix Network include a piece of metadata that defines an array of public key hashes as the owner_keys for that component, where any of the corresponding keys could sign to prove ownership of the account/identity. A single public key hash is set automatically on creation of a preallocated account/identity, corresponding to the private key that created the component. The user (assisted by the Radix Wallet) may change it in the future to enable convenient multi-factor recovery of control of accounts and identities. Then the typical workflow for ROLA authentication is this: - The dApp backend creates a challenge (with a limited time of validity) and passes it to the frontend. - The dApp frontend makes a request to the Radix Wallet (using √ Connect Button or Wallet SDK) for either a login or account(s) address(es) with required proof of ownership. The challenge is included in this request. - The user selects the Persona or Account(s) requested. - The user’s Radix Wallet produces a cryptographic signature using the private key corresponding to one of the configured public key hashes in the owner_keys metadata entry. It returns to the dApp the address, challenge, public key, and signature. 
- The dApp frontend then passes that information to ROLA in the dApp backend. - ROLA checks that the challenge is still valid, and that the address, public key hash, and signature match for the current state of the account/identity component. If so, correct proof has been provided and the dApp backend and frontend may act accordingly (perhaps considering that user logged in and creating an active session in the user’s browser). Installation and Usage Please check the ROLA examples (https://github.com/radixdlt/rola) for more documentation and an end-to-end implementation of ROLA using an Express server as backend and a TypeScript dApp as client. Please also check out the ROLA npm library (https://www.npmjs.com/package/@radixdlt/rola) for use in building Node.js backends using ROLA. Finally, some Community APIs (https://github.com/gguuttss/radix-docs/blob/master/integrate/network-apis/README.md) offer delegated ROLA-as-a-service, which could be a good choice assuming you trust the given service with the responsibility to authenticate users on your behalf. ## Gateway SDK URL: https://radix.wiki/developers/legacy-docs/build/build-dapps/dapp-application-stack/dapp-sdks/gateway-sdk Updated: 2026-02-18 Summary: The TypeScript Gateway SDK provides a simple way of querying "state" on the Radix Network. Build > dApp Development > Application Stack > dApp SDKs > Gateway SDK — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/dapp-sdks/gateway-sdk.md) The TypeScript Gateway SDK provides a simple way of querying "state" on the Radix Network. Web3 applications often need information about the current (or past) state of things on the Radix Network. Things such as: - What resources are currently held in a given account? - What is the configuration and metadata of a given resource? - What happened in a given transaction? - What is the status of a given component? 
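To make the first of those questions concrete, such a state query boils down to asking a Gateway endpoint about an address. The sketch below only builds the request shape; the endpoint path and body layout are assumptions based on the Gateway API documentation, the host is a placeholder deployment, and the address is hypothetical.

```typescript
// Sketch: the shape of an entity-state query that the Gateway SDK wraps.
// ASSUMPTIONS: the "/state/entity/details" path and body shape are taken
// from the Gateway API docs and should be verified there; the host and
// address are placeholders.
const gatewayBase = "https://mainnet.radixdlt.com";
const request = {
  method: "POST",
  path: "/state/entity/details",
  body: { addresses: ["account_rdx1_hypothetical_account"] },
};

// A real dApp would send this with fetch(), or more simply let the
// Gateway SDK issue and parse the query for it.
console.log(`${request.method} ${gatewayBase}${request.path}`);
```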
The Gateway SDK is a thin wrapper around the Gateway API (https://github.com/gguuttss/radix-docs/blob/master/integrate/network-apis/README.md) ; documentation for this API can be found here (https://radix-babylon-gateway-api.redoc.ly/) . Note that if you are already using the Radix dApp Toolkit, it exposes an interface for this, so you do not need to import the Gateway SDK separately. Installation and Usage See the Gateway SDK package on NPM for installation and usage documentation. Gateway SDK NPM (https://www.npmjs.com/package/@radixdlt/babylon-gateway-api-sdk) | Github (https://github.com/radixdlt/babylon-gateway/tree/main/sdk/typescript/) ## dApp Toolkit URL: https://radix.wiki/developers/legacy-docs/build/build-dapps/dapp-application-stack/dapp-sdks/dapp-toolkit Updated: 2026-02-18 Summary: The Radix dApp Toolkit wraps together the Wallet SDK and Gateway SDK(gateway-sdk.md) along with Build > dApp Development > Application Stack > dApp SDKs > dApp Toolkit — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/dapp-sdks/dapp-toolkit.md) The Radix dApp Toolkit (https://github.com/radixdlt/radix-dapp-toolkit/tree/main) wraps together the Wallet SDK (https://github.com/radixdlt/wallet-sdk) and Gateway SDK (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/dapp-sdks/gateway-sdk.md) along with a framework-agnostic √ Connect Button (https://github.com/radixdlt/connect-button) web component. Together they make it easy for developers to connect users and their Radix Wallet to their dApps. √ Connect Button The √ Connect Button appears as a consistent, Radix-branded UI element that helps users identify your dApp website as a Radix dApp that is compatible with the Radix Wallet – and it automatically provides a consistent user experience for users to connect with their wallet and see the current status of the connection between dApp and Radix Wallet. 
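Wiring the toolkit into a site starts with a small configuration object. The sketch below shows its likely shape; the option names (dAppDefinitionAddress, networkId, applicationName) follow the toolkit README but should be treated as assumptions and checked against the linked docs, and the address is a placeholder.

```typescript
// Sketch of a dApp Toolkit configuration object.
// ASSUMPTIONS: option names follow the toolkit README; the address is a
// hypothetical placeholder. Network id 2 is Stokenet, 1 is Mainnet.
const toolkitConfig = {
  dAppDefinitionAddress: "account_tdx_2_hypothetical_dapp_definition",
  networkId: 2, // Stokenet (use 1 for Mainnet)
  applicationName: "Hypothetical dApp",
  applicationVersion: "0.1.0",
};

// In a real app this object would be passed to RadixDappToolkit(...)
// imported from @radixdlt/radix-dapp-toolkit.
console.log(toolkitConfig.networkId); // 2
```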
Installation and Usage See GitHub for full documentation and installation instructions: - Radix dApp Toolkit Github (https://github.com/radixdlt/radix-dapp-toolkit#readme) - Radix dApp Toolkit NPM (https://www.npmjs.com/package/@radixdlt/radix-dapp-toolkit) ## dApp SDKs URL: https://radix.wiki/developers/legacy-docs/build/build-dapps/dapp-application-stack/dapp-sdks/dapp-sdks Updated: 2026-02-18 Summary: Legacy documentation: dApp SDKs Build > dApp Development > Application Stack > dApp SDKs — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/dapp-sdks/README.md) ## Building a Full Stack dApp URL: https://radix.wiki/developers/legacy-docs/build/build-dapps/dapp-application-stack/building-a-full-stack-dapp Updated: 2026-02-18 Summary: A Full Stack dApp adds a traditional server backend component to do things that a pure frontend dApp(building-a-frontend-dapp.md) can’t. Build > dApp Development > Application Stack > Building a Full Stack dApp — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/building-a-full-stack-dapp.md) Overview A Full Stack dApp adds a traditional server backend component to do things that a pure frontend dApp (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/building-a-frontend-dapp.md) can’t. For an overview of dApp types on Radix, see What is a dApp on Radix (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/README.md) ? Here’s the big picture of a full stack dApp. You’ll see the portion on the left is identical to the pure frontend dApp, but now a server backend portion has been added to the right, including new tools for you to use. The green blocks below are built by you, while the blue blocks are existing tools for you to make use of. 
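One job such a server backend typically takes on is the challenge lifecycle of ROLA-style authentication described earlier: the backend issues a short-lived random challenge, hands it to the frontend, and later checks that it has not expired before verifying the wallet's signature. A minimal sketch (signature verification itself is omitted, and the in-memory store and TTL are illustrative choices):

```typescript
import { randomBytes } from "node:crypto";

// Sketch of a backend-issued, time-limited challenge store. The Map and
// the 5-minute TTL are illustrative; a production backend would also
// verify the wallet's signature and evict used challenges.
type Challenge = { value: string; expiresAt: number };

const challenges = new Map<string, Challenge>();

function createChallenge(ttlMs = 5 * 60 * 1000): Challenge {
  const challenge: Challenge = {
    value: randomBytes(32).toString("hex"),
    expiresAt: Date.now() + ttlMs,
  };
  challenges.set(challenge.value, challenge);
  return challenge;
}

function isChallengeValid(value: string): boolean {
  const found = challenges.get(value);
  return found !== undefined && Date.now() < found.expiresAt;
}

const c = createChallenge();
console.log(isChallengeValid(c.value)); // true
console.log(isChallengeValid("unknown")); // false
```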
See more about ROLA (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/rola-radix-off-ledger-auth.md) for authentication of logins and ownership of accounts from the Radix Wallet. More information and walkthrough coming soon…​ ## Building a Frontend dApp URL: https://radix.wiki/developers/legacy-docs/build/build-dapps/dapp-application-stack/building-a-frontend-dapp Updated: 2026-02-18 Summary: A Pure Frontend dApp is the simplest kind of dApp. It has no server backend portion at all – it is a website that connects to a user’s Radix Wallet, gets data f Build > dApp Development > Application Stack > Building a Frontend dApp — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/building-a-frontend-dapp.md) Overview A Pure Frontend dApp is the simplest kind of dApp. It has no server backend portion at all – it is a website that connects to a user’s Radix Wallet, gets data from the Radix Network, and proposes transactions to the wallet. For an overview of dApp types on Radix, see What is a dApp on Radix (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/README.md) ? It’s useful to start with the big picture of the various tools available to you, and how they connect together. The green blocks below are built by you and the blue blocks are existing tools for you to make use of. Your dApp website project will likely include the Radix dApp Toolkit (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/dapp-sdks/dapp-toolkit.md) , which provides a useful combined interface to a √ Connect Button, Wallet SDK, and Gateway SDK. These tools provide a seamless way to facilitate the interaction between the user and your dApp with the Radix Wallet. 
Let’s break down each specific element within the Radix dApp Toolkit: The √ Connect Button provides a common, recognizable, Radix-branded UI "button" element that allows your users to easily see that they can connect their Radix Wallet to your website, and provides a familiar experience in doing so. The Wallet SDK is the workhorse for interacting with the Radix Wallet. It provides an interface for making requests to the wallet for various data held there for the user, including account addresses and more – if the user permits it. It also provides an interface for submitting transactions to the Radix Wallet, which then asks the user to review and submit it. The Gateway SDK is how your dApp can see what’s happening on the network and what the state of things there is. A common pattern would be to ask the Gateway for the resources held in an account address provided to your dApp by the user’s wallet (in response to a Wallet SDK request). Your dApp might use that to show how many tokens they have of a given type to interact with your dApp. In addition to this, your frontend dApp may also have an "on-ledger backend" portion on the Radix Network itself. For example, you can build your own powerful automation for assets as a component, written in Scrypto. Moreover, you can create assets and tokens of your own as "resources", too. With this setup, third parties may have existing components and resources of their own on the Radix Network that can interact with your frontend as well. After you’ve built your website, components, and resources that make up your dApp, you will need a way for the Radix Wallet to understand the relationship between each piece. To do this, you can set up a dApp Definition account (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/dapp-definition-setup.md) . The dApp Definition account provides an on-ledger unique identifier that each part of your dApp can point to in order to establish their relationship. 
This provides your users a rich experience when interacting with your dApp. When users interact with your dApp, your dApp interacts with its components (and/or components created by others) by building a transaction manifest (transaction-manifest) . These transaction manifests are composed of the user’s intent, which describes how assets move between user accounts and one or more components to perform potentially complex and powerful actions. The transaction manifest is then submitted to the Radix Wallet (via Wallet SDK) for the user to review and accept - confirming their intentions. To get a feel for how these three parts of the dApp interact, imagine a simple dApp that presents a gumball machine to the user where they can insert XRD tokens, and receive GUM tokens in return. To create the dApp, the developer does the following: - Requests the creation of a GUM token as a “resource” from the Radix network, setting metadata on it like symbol (GUM), name, description, etc. that allow it to be displayed nicely in the Radix Wallet. (No code needed here.) - Creates and deploys to the Radix Network a GumballMachine component (in Scrypto) that accepts XRD and returns GUM from an internal supply. - Creates and hosts a dApp Frontend website that graphically displays how many gumballs are available and lets the user connect their Radix Wallet to buy them. The user flow might look like this: - The user loads up the dApp Frontend website in their browser. As part of loading, the website sends a query to the Radix Network (via a Gateway) to find out how many GUM tokens are held in the GumballMachine component’s internal vault. The website uses this to build its display of the gumball machine. - The user clicks the √ Connect button in the corner of the website, which automatically brings up a request (specified by the website) in their Radix Wallet app for the addresses of any accounts that they might want to buy gumballs from. The user selects 3 accounts and approves. 
- The website updates to show the user as connected, and shows an account selector menu where they can pick which of their 3 accounts they want to buy a gumball from right now. The website queries the XRD balances on the accounts and sees that one of them has no XRD to spend, and so this account is grayed out. - The user picks an account and clicks a Buy Gumball button on the website. - The website builds a transaction manifest stub that specifies that 1 XRD should be withdrawn from the user’s account, this should be passed to the buy_gumball method of the GumballMachine component, and whatever it returns should be deposited back to the user’s account. This transaction manifest is passed to the user’s Radix Wallet. - The Radix Wallet shows a friendly view of the transaction to the user, showing 1 XRD leaving their account and (according to a simulation of the result) 1 GUM entering their account at the end. The wallet automatically adds a small network fee payment to the transaction. - The user, satisfied with what has been proposed, clicks Approve in their Radix Wallet and the transaction is swiftly signed (to authorize the withdrawal of 1 XRD) and submitted to the Radix Network, where it is accepted and the result committed to the ledger. - With the transaction successful, the website shows an amusing animation of a gumball being dispensed – and the user will now see the GUM token in their wallet. With a very simple system, the dApp has now performed a real exchange of digital assets with the user! 
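The manifest stub described in the flow above can be sketched as a small string-building helper of the kind a dApp frontend would hand to the wallet. This is a hypothetical illustration only: the instruction names follow the Radix transaction manifest style, and every address is a placeholder rather than a real network value.

```javascript
// Hypothetical sketch of the gumball purchase manifest described above.
// All addresses are illustrative placeholders, not real network values.
function getBuyGumballManifest({ accountAddress, xrdAddress, componentAddress }) {
  return [
    // Withdraw 1 XRD from the user's account onto the transaction worktop.
    `CALL_METHOD Address("${accountAddress}") "withdraw" Address("${xrdAddress}") Decimal("1");`,
    // Put the withdrawn XRD into a bucket to pass to the component.
    `TAKE_ALL_FROM_WORKTOP Address("${xrdAddress}") Bucket("payment");`,
    // Call buy_gumball with the payment; the returned GUM lands on the worktop.
    `CALL_METHOD Address("${componentAddress}") "buy_gumball" Bucket("payment");`,
    // Deposit whatever the component returned back into the user's account.
    `CALL_METHOD Address("${accountAddress}") "deposit_batch" Expression("ENTIRE_WORKTOP");`,
  ].join("\n");
}

const manifest = getBuyGumballManifest({
  accountAddress: "account_tdx_2_example",
  xrdAddress: "resource_tdx_2_example",
  componentAddress: "component_tdx_2_example",
});
console.log(manifest);
```

A frontend would then pass a string like this to the wallet via the dApp Toolkit's `sendTransaction`, leaving the user to review and approve it.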
Pure Frontend dApp Walkthrough To see first-hand how to build a pure frontend dApp with these tools, see the following walkthrough on building a simple "gumball machine" dApp, including a Scrypto-based component, and a simple website that connects to the Radix Wallet and Radix Network: Continue to gumball machine walkthrough on GitHub (https://github.com/radixdlt/scrypto-examples/tree/main/full-stack/dapp-toolkit-gumball-machine) ## Application Stack URL: https://radix.wiki/developers/legacy-docs/build/build-dapps/dapp-application-stack/dapp-application-stack Updated: 2026-02-18 Summary: The emerging field of Web3 (and, closely related, decentralized finance or “DeFi”) offers exciting opportunities for developers to build a new class of decentra Build > dApp Development > Application Stack — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/README.md) https://www.youtube.com/embed/_RaFcIuz1h8?feature=shared (https://www.youtube.com/embed/_RaFcIuz1h8?feature=shared) Introduction The emerging field of Web3 (and, closely related, decentralized finance or “DeFi”) offers exciting opportunities for developers to build a new class of decentralized applications (or “dApps”) around meaningful and valuable digital assets and identity. The concept of the Web3 model is that users stay in control of what they own, who they are, what they share, and what they choose to bring with them to applications – and application builders like you create powerful new ways for those users and what they own to interact. Whether we’re talking about a finance application that previously would have required access to trans-national banking infrastructure, or a game where users can take their characters and loot with them when they log off – Web3 and DeFi dApps should be able to simply do things better, and do things that were never possible before. 
Much of this potential, however, has not yet been made practically accessible to developers due to shortcomings in existing Web3 and DeFi platforms. Radix offers a decentralized platform and a set of tools that gives you the most purpose-built place to create this new generation of Web3 dApps. On Radix, digital assets are easy to create and manage, smart contract logic is intuitive to write and make safe, and the Radix Wallet enables seamless and powerful connections to users and their assets through your web interface. To use the Radix platform and its tools, it’s important to understand the unique structure of a Radix dApp and how the pieces fit together. The Anatomy of a Radix dApp Building your dApp means more than just building a simple website. That website will need to connect to users’ Radix Wallet, and it will likely want to connect to a Gateway for the Radix Network to get information on the state of accounts, assets, and smart contracts. You may want to build your own smart contract automation on the Radix Network to sit behind your web interface – or understand how to interact with smart contracts already deployed there by other developers. And like a traditional Web2 app, you may want a server-based portion of your application for things like user management or specialized business logic. To understand the anatomy of a Radix dApp, we can split them into two broad categories: the pure frontend dApp, and the full stack dApp. Pure Frontend dApps If you’ve previously built dApps on Ethereum, this type might be somewhat familiar; something like Uniswap would be in this category. If you’re new to Web3 and DeFi, this might seem a little strange. Because decentralized ledger networks like Ethereum and Radix can store and run powerful asset automation in the form of “smart contracts”, a web application may simply provide a UI to those smart contracts and not use a traditional server-side backend at all. 
Of course such a dApp can have no traditional server-side user management. It can be thought of as just a program that the user runs locally in their web browser that talks to the wallet and the network only. On Radix, this type of dApp is structured like so: The portions in blue are existing things from Radix that you can make use of. The portions in green are your responsibility as a dApp developer. There are three major parts to this type of dApp: - The Radix Wallet - You, the developer, don’t have to build anything at all here. It’s the place where users control their assets, and where they’ll give approval for things whenever the dApp wants to interact with their accounts, personas (logins + personal data), or assets. - The dApp Frontend - This is your dApp’s user interface. It could be very lightweight (a simple UI) or quite heavy (a game built off-chain that interacts with the Radix Network for in-game asset ownership). The Radix dApp Toolkit provides a useful combined interface to a √ Connect Button, Wallet SDK, and Gateway SDK, which makes it easy to handle user logins to your dApp, retrieve information from the user’s wallet, and automatically provide the user with seamless communication between the dApp and the wallet. - The Radix Network - This is where all accounts and assets live. It is also where transactions can interact with smart contracts, which on Radix are called “components”. You might write your own components in Scrypto to run in that shared trustless network environment – or you might choose to interact with components deployed by others there. Your dApp may also want to access data about the current state of the network - things like how many tokens are in a given account, or what the status of a given component is. dApps can query the network using a Gateway endpoint. 
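A Gateway state query can also be pictured as a plain HTTP request, without any SDK. The sketch below only builds the request object; the endpoint path and body shape are assumptions modelled on the Babylon Gateway API, so check the Gateway API reference before relying on them.

```javascript
// Build (but do not send) a Gateway state query. The /state/entity/details
// path and the { addresses } body are assumptions, not verified values.
function entityDetailsRequest(gatewayBaseUrl, addresses) {
  return {
    url: `${gatewayBaseUrl}/state/entity/details`,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ addresses }),
    },
  };
}

// A dApp would then call fetch(req.url, req.options) and read the JSON reply.
const req = entityDetailsRequest("https://stokenet.radixdlt.com", [
  "account_tdx_2_example", // placeholder account address
]);
console.log(req.url);
```

In practice the Gateway SDK wraps queries like this behind typed methods, but the underlying shape is an ordinary JSON-over-HTTP exchange.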
If you’re thinking about building a pure frontend dApp, continue to Building a Pure Frontend dApp section (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/building-a-frontend-dapp.md) . Full Stack dApps This type of dApp keeps everything we had in the pure frontend dApp, but introduces a traditional server-side backend for more complex applications and user-personalized experiences. Think of it as combining the richness of the best of Web2 applications with the Web3 superpowers of digital assets and personal control of data. This type of dApp on Radix is structured like so: The new dApp backend part of the system has two new functions that interact with the Radix Network: - User Management - This is where you keep track of users that login to your system so you can personalize their experience. Radix’s Persona logins allow simple one-tap no-password logins using the Radix Wallet with on-ledger Identities. ROLA (Radix Off-Ledger Authentication) makes this easy. - Asset Management - This is something you might want if your dApp needs to directly submit transactions to the network itself rather than proposing transactions to a user’s Radix Wallet to sign and submit. Simple examples of this would be managing the dApp’s own internal reserves of tokens or configuring its components. Once again the Gateway SDK makes it simple to interact with a network Gateway for either queries needed by the backend, or to submit such transactions. If you come from the Web2 world, this type of dApp may feel more familiar. It’s Web2, but with new superpowers: - Your application can seamlessly interact with digital assets and asset-oriented smart contract logic on the Radix Network to create more valuable experiences and connections to other applications. - Users have a secure, no-password mechanism to login using the Radix Wallet. 
- Rather than having to store user personal data in your own database (both dangerous and objectionable to users), access personal data directly from the user’s Radix Wallet – with their permission, as needed. If you’re ready to get started creating dApps on Radix, continue on to Building your first Scrypto Component (installation) and then Building a Pure Frontend dApp (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/building-a-frontend-dapp.md) and Building a Full Stack dApp (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/building-a-full-stack-dapp.md) . ## dApp Development URL: https://radix.wiki/developers/legacy-docs/build/build-dapps/build-dapps Updated: 2026-02-18 Summary: Legacy documentation: dApp Development Build > dApp Development — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/README.md) ## Wrapping Up the Learning Step-by-Step URL: https://radix.wiki/developers/legacy-docs/build/learning-step-by-step/wrapping-up-the-learning-step-by-step Updated: 2026-02-18 Summary: We've made it to the end of this step-by-step learning journey. You now know enough to start making your own dapps for the Radix network and help grow our DeFi Build > Learning Step-by-Step > Wrapping Up the Learning Step-by-Step — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/wrapping-up-the-learning-step-by-step.md) We've made it to the end of this step-by-step learning journey. You now know enough to start making your own dapps for the Radix network and help grow our DeFi ecosystem, though there is much more to learn still. Make sure you have a look at the Before You Release (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/before-you-release/README.md) docs pages before releasing any of your code into the wild. 
You'll find advice there for ensuring the security and intended behaviour (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/before-you-release/code-hardening.md) of your dapps, as well as suggestions for optimisation and readiness-for-release checks (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/before-you-release/productionize-your-code.md) . How did you find the Learning Step-by-Step? As you've made it to the end of this journey, please tell us if it was helpful. You can do that by clicking one of the buttons below ⬇ and leaving any feedback you'd like. We'd love to know if there are any ways to improve these articles. ## Run the Radiswap dApp URL: https://radix.wiki/developers/legacy-docs/build/learning-step-by-step/run-the-radiswap-dapp Updated: 2026-02-18 Summary: The Radiswap dApp is the last example in the step-by-step learning journey. It takes the concepts learned in the previous sections and combines them with some n Build > Learning Step-by-Step > Run the Radiswap dApp — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/run-the-radiswap-dapp.md) The Radiswap dApp is the last example in the step-by-step learning journey. It takes the concepts learned in the previous sections and combines them with some new additions to make a single, more complex demonstration of what you can do with Scrypto. The Radiswap dApp is a decentralized exchange (DEX) that allows users to deposit and swap between two tokens. The scrypto package referenced in this section can be found in our official examples here (https://github.com/radixdlt/official-examples/blob/main/step-by-step/21-radiswap-dapp) . The Radiswap Scrypto Package Two Resource Pools The Radiswap package is a customised wrapper around the standard TwoResourcePool native blueprint with the addition of a swap function. 
There are a range of native pool blueprints (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/pool.md) available in the Radix Engine, and the TwoResourcePool is one of the most commonly used. It does what you would expect and holds two resources, allowing users to deposit and withdraw from the pool in exchange for Pool Unit resources (often called LP tokens). This is a useful part of the functionality of many dApps and these functions are accessed with the add_liquidity and remove_liquidity methods. In our case the Radiswap blueprint extends the pool blueprint's methods by also allowing users to swap between the two resources in the pool. The Swap Method The swap method accepts an input amount of one resource and returns an output amount of the other resource. pub fn swap(&mut self, input_bucket: Bucket) -> Bucket { The exchange rate is determined by comparing the size of the resource pools with a formula used in many automated market makers (AMMs), a constant product formula that looks like this: output_amount = input_amount * (output_reserves / (input_reserves + input_amount)) The checked math (../build-dapps/before-you-release/code-hardening.md#pay-special-attention-to-decimal-operations) version of this becomes: let output_amount = input_amount .checked_mul(output_reserves) .unwrap() .checked_div(input_reserves.checked_add(input_amount).unwrap()) .unwrap(); If you aren't familiar with the formula, you can find out more about it and how it's used by looking up AMMs. Radiswap Component Instantiation To instantiate a Radiswap component we need to provide 4 arguments: - owner_role - what rule defines the component owner - resource_address1 - the first resource in the pool - resource_address2 - the second resource in the pool - dapp_definition_address - the address of the dapp definition account The first and last of these are new to us. 
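The constant product formula shown above for the swap method can be tried out with plain numbers. A minimal JavaScript sketch (using ordinary floating point, whereas the Scrypto version uses checked Decimal math):

```javascript
// output_amount = input_amount * (output_reserves / (input_reserves + input_amount))
function swapOutput(inputAmount, inputReserves, outputReserves) {
  return (inputAmount * outputReserves) / (inputReserves + inputAmount);
}

// Swapping 10 of token A into a 100 A / 100 B pool yields about 9.09 B,
// not 10: the price moves against the trader as their input grows the
// reserves on the input side. This is the "constant product" property --
// input_reserves * output_reserves stays (approximately) constant.
const out = swapOutput(10, 100, 100);
console.log(out); // ≈ 9.0909
console.log((100 + 10) * (100 - out)); // ≈ 10000, the original 100 * 100
```

Note that a real implementation would also take a fee from the input amount before applying the formula; the Radiswap example keeps the maths minimal.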
Customizing the Owner Role in a Transaction Manifest The first is simply the full owner role declaration that we've been declaring in blueprints before, usually either as OwnerRole::None or using the rule! macro and some resource address, e.g. OwnerRole::Fixed(rule!(require( owner_badge.resource_address() ))) Now that we've made it an argument we'll need to provide the full role in a transaction manifest when we instantiate the component. To do that we'll use some new Manifest Value Syntax (https://github.com/gguuttss/radix-docs/blob/master/reference/sbor-serialization/manifest-sbor/manifest-value-syntax.md) instead of the rule! shorthand, which works in Scrypto but not in manifests. This will give us a function call that looks something like this: CALL_FUNCTION Address("") "Radiswap" "new" Enum( Enum( Enum( Enum( Enum( Address("") ) ) ) ) ) Address("") Address("") Address("") ; Though for the rare case of no owner we could just put: CALL_FUNCTION Address("") "Radiswap" "new" Enum() Address("") Address("") Address("") Adding a Dapp Definition Account Address in a Transaction Manifest The second new instantiation argument is the dapp definition account address. Adding this address as an argument allows us to add it as metadata for the component now, rather than in the Developer Console later. In Set Verification Metadata (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-set-verification-metadata.md) we customised our metadata in the Developer Console (https://stokenet-console.radixdlt.com/configure-metadata) , but if we already know what we want it to be, we can add it at instantiation, e.g. 
pub fn new( owner_role: OwnerRole, resource_address1: ResourceAddress, resource_address2: ResourceAddress, dapp_definition_address: ComponentAddress, ) -> Global { // --snip-- .instantiate() .prepare_to_globalize(owner_role.clone()) .with_address(address_reservation) .metadata(metadata!( init { // --snip-- "dapp_definition" => dapp_definition_address, updatable; } )) .globalize(); Adding the dapp definition account address as metadata in the instantiation manifest will look like this: CALL_FUNCTION Address("") "Radiswap" "new" // --snip-- Address("") ; Updating Metadata You can find more information about setting and updating metadata in the Entity Metadata (../../reference/radix-engine/metadata/entity-metadata.md#configuring-metadata-roles) section of the documentation. This changes the steps to instantiate the package component on Stokenet. You can find those updated steps in the Using the Radiswap Front End on Stokenet (run-the-radiswap-dapp.md#using-the-radiswap-front-end-on-stokenet) Setup section below. Event Emission Events in Scrypto (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/scrypto-events.md) are a way to communicate with off-chain clients. They are emitted by the component and can be listened for to begin secondary actions with the Gateway (../../integrate/network-apis/README.md#gateway-api) or Core (../../integrate/network-apis/README.md#core-api) APIs. There are many events that already exist in the core components. You may have noticed these in transaction receipts on resim. In the Radiswap component we also emit custom events when different methods are called. 
For example, a SwapEvent, which contains the amount of each resource swapped: #[derive(ScryptoSbor, ScryptoEvent)] pub struct SwapEvent { pub input: (ResourceAddress, Decimal), pub output: (ResourceAddress, Decimal), } is emitted whenever the swap method is called: pub fn swap(&mut self, input_bucket: Bucket) -> Bucket { // --snip-- Runtime::emit_event(SwapEvent { input: (input_bucket.resource_address(), input_bucket.amount()), output: (output_resource_address, output_amount), }); For these events to be emitted successfully, they all need to be declared in a #[events(...)] attribute at the start of the blueprint: #[blueprint] #[events(InstantiationEvent, AddLiquidityEvent, RemoveLiquidityEvent, SwapEvent)] mod radiswap { As no part of a transaction will succeed if any of it fails, events will not be emitted if the transaction does not complete successfully. The Radiswap Front End The Radiswap front end is a single web page that allows anyone with a Radix Wallet to interact with the Radiswap component on ledger. Its HTML defines several buttons and text inputs and runs the client/main.js script, where interactions with the Radix network and wallet take place. These types of front end interactions were previously described in more detail in both the Run Your First Front End Dapp (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-run-your-first-front-end-dapp.md) and Run the Gumball Machine Front End dApp (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-run-the-gumball-machine-front-end-dapp.md) sections of the documentation. They are summarised again here. 
In client/main.js we use the radix-dapp-toolkit (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/dapp-sdks/dapp-toolkit.md) and gateway-api-sdk (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/dapp-sdks/gateway-sdk.md) to interact with the Radix network. import { RadixDappToolkit, // --snip-- } from "@radixdlt/radix-dapp-toolkit"; import { GatewayApiClient } from "@radixdlt/babylon-gateway-api-sdk"; A connection to the Radix Wallet and Network is established using the Radix dApp Toolkit, Gateway API and a Dapp Definition (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/dapp-definition-setup.md) : const dAppDefinitionAddress = "_YOUR_DAPP_DEFINITION_ACCOUNT_ADDRESS_"; // --snip-- // Create a dapp configuration object for the Radix Dapp Toolkit and Gateway API const dappConfig = { networkId: RadixNetwork.Stokenet, applicationVersion: "1.0.0", applicationName: "Hello Token dApp", applicationDappDefinitionAddress: dAppDefinitionAddress, }; // Instantiate Radix Dapp Toolkit to connect to the Radix wallet const rdt = RadixDappToolkit(dappConfig); // Instantiate Gateway API client to query the Radix network const gatewayApi = GatewayApiClient.initialize(dappConfig); With this and the Radiswap component address, the front end can then get information about the component’s state via the gateway API, e.g. const componentDetails = await gatewayApi.state.getEntityDetailsVaultAggregated(componentAddress); As well as connect to and request details from a Radix Wallet: rdt.walletApi.setRequestData(DataRequestBuilder.accounts().exactly(1)); With a connection to the wallet established, the dapp can then use the transaction manifest generation functions in the client/manifests directory to interact with the Radiswap component. 
For example, to swap resources in the pool: import { // --snip-- getSwapManifest, } from "./manifests"; // --snip-- swapButton.onclick = async function () { const manifest = getSwapManifest({ accountAddress: account.address, resourceAddress: swapTokenInput.value, amount: swapAmountInput.value, componentAddress, }); const result = await rdt.walletApi.sendTransaction({ transactionManifest: manifest, version: 1, }); if (result.isErr()) throw result.error; This covers most of the types of front end functionality, but the commented code (https://github.com/radixdlt/official-examples/blob/main/step-by-step/21-radiswap-dapp/client/main.js) is waiting to be explored, if you're looking for more detail. Using Radiswap There are two described ways you can use Radiswap, both of which you'll find instructions for in our official examples on Github. - Using the Radiswap Scrypto package in resim (https://github.com/radixdlt/official-examples/blob/main/step-by-step/21-radiswap-dapp#using-the-radiswap-scrypto-package-in-resim) will show you how to use Radiswap locally, in the Radix Engine Simulator - Using the Radiswap front end on Stokenet (https://github.com/radixdlt/official-examples/tree/main/step-by-step/21-radiswap-dapp#using-the-radiswap-front-end-on-stokenet) shows you how to deploy the scrypto package to the test network, instantiate a new Radiswap component, then connect that to a usable, locally running front end. 
## Test a Multi-Blueprint Package URL: https://radix.wiki/developers/legacy-docs/build/learning-step-by-step/learning-to-test-a-multi-blueprint-package Updated: 2026-02-18 Summary: This topic is still a work in progress Build > Learning Step-by-Step > Test a Multi-Blueprint Package — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-test-a-multi-blueprint-package.md) This topic is still a work in progress This area is either a placeholder for content which is yet to be written, or has sparse/early content which is still being worked on. The last section introduced us to testing blueprints. It's now time to apply those lessons to a more complex blueprint package; our familiar favourite, the Candy Store. This will show us how to test larger packages with multiple blueprints and methods. We'll add tests to the version of the Candy Store and Gumball Machine blueprints we created in the Create Owned Components (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-create-owned-components.md) section. The scrypto package referenced in this section can be found in our official examples here (https://github.com/radixdlt/official-examples/tree/main/step-by-step/20-candy-store-tests) . Testing the Candy Store Package We can have multiple test files for a Scrypto package by placing them in the tests/ directory. We've done this with the separate modules of our package which each have their own test files, tests/candy_store.rs and tests/gumball_machine.rs. To import the modules into their test files we need to add the pub keyword to module exports in /src/lib.rs. This makes them accessible for testing: pub mod candy_store; pub mod gumball_machine; The two modules in the package are nicely separated in ways that make a different testing approach preferable for each. 
The gumball_machine module is a good candidate for unit testing, while the candy_store module is better for integration testing. Unit Testing the Gumball Machine Module Most of the logic of our package is in the gumball_machine module. To test it we use the Scrypto-Test framework to write unit tests. First we add the scrypto-test and radix-engine-interface crates to the Cargo.toml file dev-dependencies, as well as candy-store itself with the test feature: [dev-dependencies] # --snip-- scrypto-test = { git = "https://github.com/radixdlt/radixdlt-scrypto", tag = "v1.1.1" } radix-engine-interface = { git = "https://github.com/radixdlt/radixdlt-scrypto", tag = "v1.1.1" } candy-store = { path = ".", features = ["test"] } Then we add the required imports to tests/gumball_machine.rs: use radix_engine_interface::prelude::*; use scrypto::this_package; use scrypto_test::prelude::*; use candy_store::gumball_machine::test_bindings::*; We can now write tests for the gumball_machine module. We'll start with a helper function to arrange the test environment: fn arrange_test_environment( price: Decimal, ) -> Result<(TestEnvironment, GumballMachine), RuntimeError> { let mut env = TestEnvironment::new(); let package_address = Package::compile_and_publish(this_package!(), &mut env)?; let (gumball_machine, _owner_badge) = GumballMachine::instantiate_global(price, package_address, &mut env)?; Ok((env, gumball_machine)) } This function creates a new TestEnvironment, compiles and publishes the package, and instantiates a GumballMachine with the given price. It exists to stop us repeating code, as we can use it as part or all of the setup for the tests that follow. The first test checks that the GumballMachine can be instantiated, by just running our helper function. If it doesn't panic, the test passes. 
#[test] fn can_instantiate_gumball_machine() -> Result<(), RuntimeError> { let (_env, _gumball_machine) = arrange_test_environment(dec!(1))?; Ok(()) } After this, the rest of our unit tests follow a similar pattern, with clear arrange, act, and assert sections. For example, we can test that the GumballMachine can be refilled: #[test] fn can_refill_gumball_machine() -> Result<(), RuntimeError> { // Arrange let (mut env, mut gumball_machine) = arrange_test_environment(dec!(10))?; let payment = BucketFactory::create_fungible_bucket(XRD, dec!(100), Mock, &mut env)?; let _ = gumball_machine.buy_gumball(payment, &mut env)?; env.disable_auth_module(); // Act gumball_machine.refill_gumball_machine(&mut env)?; // Assert let status = gumball_machine.get_status(&mut env)?; assert_eq!(status.amount, dec!(100)); Ok(()) } Here we: - arrange the environment and gumball machine with our helper function, then use the BucketFactory to create a payment bucket and buy a gumball, - act by refilling the gumball machine, - and assert that the amount of gumballs in the machine is now back to 100. Bucket Factories BucketFactory (../scrypto-1/testing/scrypto-test.md#creation-of-buckets-and-proofs) is a part of the scrypto-test framework, used to create buckets for testing. Integration Testing the Candy Store Module As the candy_store module is the globalised part of the package, any transaction manifests addressing the package will interact with a Candy Store component (or blueprint when instantiating a component) first. As it's the entry point for all method calls, it's much better suited to integration testing than the gumball_machine module. The Scrypto Test Runner works by generating and running transaction manifests, so it too is ideal for integration testing.
To use it in the tests/candy_store.rs file we first need to add the scrypto-unit and radix-engine-interface crates to the Cargo.toml file dev-dependencies: [dev-dependencies] # --snip-- scrypto-unit = { git = "https://github.com/radixdlt/radixdlt-scrypto", tag = "v1.1.1" } # --snip-- radix-engine-interface = { git = "https://github.com/radixdlt/radixdlt-scrypto", tag = "v1.1.1" } Then we add the required imports to tests/candy_store.rs: use radix_engine_interface::prelude::*; use scrypto::this_package; use scrypto_test::prelude::*; use scrypto_unit::*; We can then start to write our test with a LedgerSimulator (simulated ledger for testing) created with LedgerSimulatorBuilder: let mut ledger = LedgerSimulatorBuilder::new().build(); Any test we write will need to emulate the way we interact with the Candy Store on the Radix network: publishing the package, using it to instantiate a CandyStore, and then calling methods on the components with transaction manifests. You can see some of this demonstrated at the start of the test: // Create a new account with associated public and private keys. let (public_key, _private_key, account_address) = ledger.new_allocated_account(); // Compile and publish the CandyStore blueprint package. let package_address = ledger.compile_and_publish(this_package!()); // ----------------- Instantiate the CandyStore ----------------- // Build a manifest to instantiate the CandyStore, including initial price argument. let gumball_price = dec!(10); let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .call_function( package_address, "CandyStore", "instantiate_candy_store", manifest_args!(gumball_price), ) .deposit_batch(account_address) .build(); // Execute the manifest, obtaining a transaction receipt.
let receipt = ledger.execute_manifest( manifest, vec![NonFungibleGlobalId::from_public_key(&public_key)], ); println!( "instantiate_candy_store Transaction Receipt:\n{}", receipt.display(&AddressBech32Encoder::for_simulator()) ); // Assert that the transaction commits successfully // If the transaction is unsuccessful, the test will fail here receipt.expect_commit_success(); The LedgerSimulator (https://docs.rs/scrypto-test/latest/scrypto_test/ledger_simulator/struct.LedgerSimulator.html) has various methods to arrange and execute in the simulated ledger, like new_allocated_account() and execute_manifest, while the ManifestBuilder (https://github.com/gguuttss/radix-docs/blob/master/integrate/rust-libraries/rust-manifest-builder.md) is used to create transaction manifests inside a Rust file. These allow us to test a variety of interactions with the Candy Store and be sure that none will fail. Have a closer look at the file to see more of how this works. Using Candy Store Tests Running tests on this package is simple: just follow the instructions in our Official Examples on GitHub (https://github.com/radixdlt/official-examples/tree/main/step-by-step/20-candy-store-tests#using-candy-store-tests) Closing thoughts This section shows how to test a Scrypto package with both unit and integration tests, but there's room for more thorough testing even here. We could add more tests to cover more edge cases and check for failed transactions where we should expect them, rather than just successful ones. There is also a lot of repeated code in these tests, which is useful to see how they work but could be reduced with helper functions. scrypto-unit and scrypto-test are powerful tools for testing Scrypto packages, and it's worth exploring the documentation (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/testing/scrypto-test.md) more to see what else they can do.
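The integration tests above all follow the same build, execute, assert cycle. As a stdlib-only sketch of that builder pattern, with mock stand-ins (MockManifestBuilder, MockReceipt and execute_manifest here are hypothetical, not the real scrypto-test API):

```rust
// Mock instruction list built up by chained calls, then "executed"
// to produce a receipt we can assert on -- mirroring the
// build -> execute_manifest -> expect_commit_success cycle.
#[derive(Default)]
struct MockManifestBuilder {
    instructions: Vec<String>,
}

impl MockManifestBuilder {
    fn new() -> Self {
        Self::default()
    }
    // Each call records an instruction and returns self for chaining.
    fn lock_fee_from_faucet(mut self) -> Self {
        self.instructions.push("lock_fee_from_faucet".to_string());
        self
    }
    fn call_method(mut self, component: &str, method: &str) -> Self {
        self.instructions.push(format!("call {component}::{method}"));
        self
    }
    fn build(self) -> Vec<String> {
        self.instructions
    }
}

struct MockReceipt {
    committed: bool,
}

impl MockReceipt {
    // Panics on failure, like the real expect_commit_success.
    fn expect_commit_success(&self) {
        assert!(self.committed, "transaction failed");
    }
}

fn execute_manifest(manifest: &[String]) -> MockReceipt {
    // A real ledger would run each instruction; here any non-empty
    // manifest "commits" successfully.
    MockReceipt { committed: !manifest.is_empty() }
}
```

Consuming self in each method and returning it is what lets the real ManifestBuilder chain calls the way the tests above do.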
## Explain Your First Test URL: https://radix.wiki/developers/legacy-docs/build/learning-step-by-step/learning-to-explain-your-first-test Updated: 2026-02-18 Summary: It's time to focus on testing. Thorough testing is essential to ensuring the proper predictable working of any Scrypto packages we write. You may have noticed a Build > Learning Step-by-Step > Explain Your First Test — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-explain-your-first-test.md) It's time to focus on testing. Thorough testing is essential to ensure the proper, predictable working of any Scrypto packages we write. You may have noticed an example of this in the Hello template used in several previous sections. It has a test/ directory that holds a lib.rs file containing two test functions. These demonstrate two ways to test the Hello blueprint, and the two main ways to test any Scrypto package. Here we'll explain both and show you how to run the tests. The scrypto package referenced in this section can be found in our official examples here (https://github.com/radixdlt/official-examples/tree/main/step-by-step/19-hello-test) . Testing Blueprints and Modules There are two ways to test blueprints and modules: the Ledger Simulator and the Test Environment. LedgerSimulator is better suited for integration testing, while TestEnvironment is better suited for unit testing. Ledger Simulator The Ledger Simulator (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/testing/scrypto-test.md) is an in-memory ledger simulator. Tests interact with the simulator as a user submitting transactions to the network would. This is great for integration and end-to-end testing. To test in Scrypto, we import scrypto_test::prelude::* and the test version of our blueprint.
In our case that's the hello blueprint, imported with use hello_test::hello_test::*; here hello_test is the package name, followed by the blueprint name appended with _test for the test version of said blueprint. These are imported at the top of our test file: use scrypto_test::prelude::*; use hello_test::hello_test::*; Test Module names For testing you need to import the test version of packages, appended with _test. e.g. to test example_blueprint in the example_package package you would import it with: use example_package::example_blueprint_test::*; To make this import work, we need to add scrypto-test to the Cargo.toml file: [dev-dependencies] scrypto-test = { version = "1.2.0" } We also need to make sure the test feature is enabled: [features] default = [] test = [] Then we can create our simulated ledger. In our case that's back in the test/lib.rs file inside the test_hello function: #[test] fn test_hello() { // Setup the ledger let mut ledger = LedgerSimulatorBuilder::new().build(); In that environment, we create an account: // Create an account let (public_key, _private_key, account) = ledger.new_allocated_account(); We then need the package available in the environment: // Publish package let package_address = ledger.compile_and_publish(this_package!()); Once we have the package we can test the instantiate function.
This is done by: - Building a manifest with the ManifestBuilder (https://github.com/gguuttss/radix-docs/blob/master/integrate/rust-libraries/rust-manifest-builder.md) : let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .call_function( package_address, "Hello", "instantiate_hello", manifest_args!(), ) .build(); - Submitting the manifest to the ledger: let receipt = ledger.execute_manifest( manifest, vec![NonFungibleGlobalId::from_public_key(&public_key)], ); - Checking the manifest receipt to see if it successfully instantiated a new component, then storing the component address for later use if it did: let component = receipt.expect_commit(true).new_component_addresses()[0]; With the component in our test environment and its address, we can now test the free_token method. The same three steps are followed, but with a different manifest: - Build a manifest: let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .call_method(component, "free_token", manifest_args!()) .call_method( account, "deposit_batch", manifest_args!(ManifestExpression::EntireWorktop), ) .build(); - Submit the manifest to the ledger: let receipt = ledger.execute_manifest( manifest, vec![NonFungibleGlobalId::from_public_key(&public_key)], ); - Check the manifest receipt to see if it was successful: receipt.expect_commit_success(); We do not need to check the return value of the free_token method as we are testing the ledger interaction, not the logic of the method. If the method returns an error, the test will fail. Testing the logic of the method is more easily done with TestEnvironment. Test Environment The Test Environment (https://docs.rs/scrypto-test/latest/scrypto-test/) framework is different to the Ledger Simulator. Instead of interacting with the ledger as a user, tests interact as native blueprints. This removes the need for transaction manifests and opens up some extra options unavailable with LedgerSimulator.
These differences make it better suited for unit testing the logic of a blueprint. Testing our Hello blueprint with TestEnvironment is done with the same test import modules: use scrypto_test::prelude::*; use hello_test::hello_test::*; Meaning scrypto-test is still needed in our Cargo.toml file's dev-dependencies, with the test feature enabled: [dev-dependencies] scrypto-test = { version = "1.2.0" } # --snip-- [features] default = [] test = [] We'll use TestEnvironment to test the free_token method output with the AAA testing pattern: Arrange, Act, Assert. In our test/lib.rs file, with the modules imported, we create a new environment and arrange the conditions for our test by publishing our package and instantiating a new Hello component from it - no manifest required: // Arrange let mut env = TestEnvironment::new(); let package_address = PackageFactory::compile_and_publish(this_package!(), &mut env)?; let mut hello = Hello::instantiate_hello(package_address, &mut env)?; This allows us to then perform the action we want to test by calling the method: // Act let bucket = hello.free_token(&mut env)?; The method returns whatever it would on ledger; in this case a bucket. We can now check that the amount of tokens in the bucket is what we expect with an assertion: // Assert let amount = bucket.amount(&mut env)?; assert_eq!(amount, dec!("1")); If the assertion is incorrect the test will panic and fail. If the assertion is correct we can return an Ok (containing an empty value): Ok(()) If you're wondering about the new syntax, TestEnvironment uses Result (https://doc.rust-lang.org/std/result/) return types for error handling, so we can use the ? operator to propagate errors up the call stack, and Ok to return the function values. In our case we're just returning Ok(()), with an empty value, to indicate the test passed; propagated errors are handled by the test framework.
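The AAA structure and Result-based error handling described above can be illustrated with a plain-Rust stand-in for the component (MockHello and MockRuntimeError are hypothetical names, not the real scrypto-test API):

```rust
// Hypothetical error type standing in for RuntimeError.
#[derive(Debug)]
struct MockRuntimeError(&'static str);

// Hypothetical stand-in for the Hello component.
struct MockHello {
    supply: u32,
}

impl MockHello {
    fn instantiate_hello() -> Result<MockHello, MockRuntimeError> {
        Ok(MockHello { supply: 1000 })
    }

    // Returns a "bucket" amount, or an error if the supply is empty.
    fn free_token(&mut self) -> Result<u32, MockRuntimeError> {
        if self.supply == 0 {
            return Err(MockRuntimeError("supply exhausted"));
        }
        self.supply -= 1;
        Ok(1)
    }
}

fn test_free_token() -> Result<(), MockRuntimeError> {
    // Arrange: instantiate the component; `?` propagates any Err
    // up to the caller, as in the real TestEnvironment tests.
    let mut hello = MockHello::instantiate_hello()?;
    // Act: call the method under test.
    let amount = hello.free_token()?;
    // Assert: the "bucket" holds exactly one token.
    assert_eq!(amount, 1);
    Ok(()) // empty Ok value signals the test passed
}
```

Any Err returned by the arrange or act steps short-circuits the function via `?`, which is exactly how a failing setup makes the real test fail without explicit checks.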
Running the Tests Running tests on a Scrypto package is simple: just follow the instructions in our Official Examples on GitHub (https://github.com/radixdlt/official-examples/tree/main/step-by-step/19-hello-test#running-the-tests) ## Use External Components URL: https://radix.wiki/developers/legacy-docs/build/learning-step-by-step/learning-to-use-external-components Updated: 2026-02-18 Summary: With the right access, components on the Radix ledger can contact and call methods on global components instantiated from other packages. This section shows us Build > Learning Step-by-Step > Use External Components — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-use-external-components.md) With the right access, components on the Radix ledger can contact and call methods on global components instantiated from other packages. This section shows us how to make those external calls with our now very familiar Candy Store and Gumball Machine components. The scrypto package referenced in this section can be found in our official examples here (https://github.com/radixdlt/official-examples/tree/main/step-by-step/18-candy-store-external-component) . External Components There are many ways to use external components in Scrypto (../scrypto-1/advanced-external-calls.md#calling-a-specific-blueprint-or-global-component-of-your-package) . Here we show you one of the simpler of them. There are two main steps: First, we use the extern_blueprint! macro to import the external blueprint into our own. This process is the same as in the previous section, but this time we won't instantiate the external component in our package. extern_blueprint! { // import the GumballMachine package from the ledger using its package address "", GumballMachine { // --snip-- } } Second, we store the external component's address (and owner badge for non-public method calls) in our new component's state.
struct CandyStore { gumball_machine_owner_badge: Vault, gumball_machine_address: Global<GumballMachine>, } The important part here is that the address is stored as the component's type, Global<GumballMachine>. Component types are all addresses in the Radix Engine. By applying the type, we can now call the methods on the external component described in our extern_blueprint! macro, e.g. pub fn buy_gumball(&mut self, payment: Bucket) -> (Bucket, Bucket) { // buy a gumball self.gumball_machine_address.buy_gumball(payment) } This combination of importing the external blueprint and storing the component address in our component's state allows us to call an external component's methods from within our component. Authorizing Calls Between Components In previous sections, this logic is abstracted away in the authorize_with_amount method. Here we explain and see the process in more detail. When we call restricted methods in one component from another, we need to prove we have authorization for the inner component. Proving ownership of badges only works per component or resource. This is to avoid the possibility of accidentally escalating permissions by providing unintended authorization. This means that when one method calls another method requiring authorization on a separate component, a proof needs to be placed on a local authorization zone for the second component. e.g. pub fn set_gumball_price(&mut self, new_price: Decimal) { // create a proof of the gumball machine owner badge let gumball_machine_owner_badge_proof = self .gumball_machine_owner_badge .as_fungible() .create_proof_of_amount(1); // place the proof on the local auth zone, so methods called within this method are authorized by it LocalAuthZone::push(gumball_machine_owner_badge_proof); // set the gumball machine's price, authorized by the gumball machine owner badge proof. self.gumball_machine_address.set_price(new_price); } The proof is created so the badge doesn't need to be removed from its vault and passed around.
It can be placed wherever we need to prove ownership of the badge and only exists for the duration of the transaction. More about proofs and the Authorization Zone can be found in the Call a Protected Method/Function documentation section (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/call-a-protected-method-function.md) . Using the Candy Store and External Gumball Machine You can try external cross-component calls for yourself by following the instructions in the official examples GitHub here (https://github.com/radixdlt/official-examples/tree/main/step-by-step/18-candy-store-external-component#using-the-candy-store-and-external-gumball-machine) ## Use External Blueprints URL: https://radix.wiki/developers/legacy-docs/build/learning-step-by-step/learning-to-use-external-blueprints Updated: 2026-02-18 Summary: Blueprints and components do not exist on the radix network in isolation. They are addressable and can interact with each other if allowed. In this section we s Build > Learning Step-by-Step > Use External Blueprints — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-use-external-blueprints.md) Blueprints and components do not exist on the Radix network in isolation. They are addressable and can interact with each other if allowed. In this section we show you how to allow and enable blueprint-to-blueprint interactions other than transactions. The scrypto package referenced in this section can be found in our official examples here (https://github.com/radixdlt/official-examples/tree/main/step-by-step/17-candy-store-external-blueprint) . External Blueprints As the Radix ecosystem continues to mature there will be more and more blueprints available to use.
These blueprints are created by many different teams and individuals and some are intended for external use, with the possibility of royalties (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/royalties/using-royalties.md) given to the blueprint owner. You may work on some of these yourself. Using external blueprints (../scrypto-1/advanced-external-calls.md#calling-a-specific-blueprint-or-global-component-of-your-package) is a little different to using your own. We need to import the blueprint into our own using the extern_blueprint! macro. This macro describes the external blueprint with its address, name, function and method signatures. mod candy_store { extern_blueprint! { // import the GumballMachine package from the ledger using its package address "", GumballMachine { // Blueprint Functions fn instantiate_global(price: Decimal) -> (Global<GumballMachine>, Bucket); fn instantiate_owned(price: Decimal, component_address: ComponentAddress) -> Owned<GumballMachine>; // Component Methods fn get_status(&self) -> Status; fn buy_gumball(&mut self, payment: Bucket) -> (Bucket, Bucket); fn set_price(&mut self, price: Decimal); fn withdraw_earnings(&mut self) -> Bucket; fn refill_gumball_machine(&mut self); } } When publishing the package, the package address is checked for an existing blueprint. Publishing will fail if none exists. The description in the macro gives us access to the external blueprint's functions in ours. In our Candy Store blueprint we use this to instantiate a component from a separately and previously published Gumball Machine package. let gumball_machine = Blueprint::<GumballMachine>::instantiate_owned(gumball_price, component_address); Note the Blueprint::<GumballMachine> type. This is generated by the extern_blueprint! macro. The syntax is a little odd if you have not worked with Rust before, but is hopefully clear. The new Gumball Machine component is now owned by our Candy Store, giving it access to the Gumball Machine's methods.
Multiple instantiation functions To allow for both an owned and a global gumball machine, we have two instantiation functions. instantiate_owned: pub fn instantiate_owned( price: Decimal, component_address: ComponentAddress, ) -> Owned<GumballMachine> { // --snip-- .instantiate() } And instantiate_global: pub fn instantiate_global(price: Decimal) -> (Global<GumballMachine>, Bucket) { // --snip-- let gumball_machine = Self::instantiate_owned(price, component_address) // assign the component owner role to the possessor of the owner_badge resource .prepare_to_globalize(OwnerRole::Fixed(rule!(require( owner_badge.resource_address() )))) // apply the address reservation .with_address(address_reservation) .globalize(); (gumball_machine, owner_badge) } This is a good demonstration of component instantiation producing owned components initially, which can subsequently be globalized. instantiate_global calls instantiate_owned, then globalizes the returned component. We use it here so the same version of the blueprint can be used in both this and the next section. Having multiple instantiation functions can serve several other purposes as well, such as allowing a standard default component to be created while also allowing more complex or customized component versions, potentially using input arguments to decide on metadata or more complex access rules (../scrypto-1/scrypto-design-patterns/reusable-blueprints-pattern.md#multiple-instantiation-functions) .
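The two-function pattern above — a base constructor returning an owned value, wrapped by a globalizing constructor — can be sketched in plain Rust. The types here are hypothetical stand-ins: "owned" components are plain values, while "global" ones carry an address, mirroring Owned vs Global:

```rust
// Hypothetical stand-in for an owned (address-less) component.
struct OwnedMachine {
    price: u32,
}

// Hypothetical stand-in for a globalized component: the same state
// plus a ledger-visible address.
struct GlobalMachine {
    price: u32,
    address: String,
}

impl OwnedMachine {
    // Base constructor, like instantiate_owned.
    fn instantiate_owned(price: u32) -> OwnedMachine {
        OwnedMachine { price }
    }

    // The global constructor reuses the owned one, then "globalizes"
    // by attaching an address -- like instantiate_global calling
    // instantiate_owned before prepare_to_globalize/globalize.
    fn instantiate_global(price: u32, address: &str) -> GlobalMachine {
        let owned = OwnedMachine::instantiate_owned(price);
        GlobalMachine { price: owned.price, address: address.to_string() }
    }
}
```

Layering the constructors this way keeps the shared setup in one place, so a caller can pick whichever variant it needs without duplicating logic.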
Using the Candy Store with an External Gumball Machine To try creating the connection between blueprint packages yourself, follow the instructions in the official examples GitHub here (https://github.com/radixdlt/official-examples/tree/main/step-by-step/17-candy-store-external-blueprint#using-the-candy-store-with-an-external-gumball-machine) ## Create Owned Components URL: https://radix.wiki/developers/legacy-docs/build/learning-step-by-step/learning-to-create-owned-components Updated: 2026-02-18 Summary: In the last section we looked at a candy store package made up of several blueprints. This section will show how to do the same thing in a different way. We wil Build > Learning Step-by-Step > Create Owned Components — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-create-owned-components.md) In the last section we looked at a candy store package made up of several blueprints. This section will show how to do the same thing in a different way. We will still have a candy store component containing a gumball machine component. The difference this time is that the gumball machine will be owned by the candy store. There are two broad ways to modularise your components, each with distinct advantages. - Global components: Like all the components from the previous sections, these are created at the global level and are accessible to all other components on the ledger. - Owned components: Internal to other components, they are only accessible to their parent components. The scrypto package referenced in this section can be found in our official examples here (https://github.com/radixdlt/official-examples/tree/main/step-by-step/16-candy-store-owned-modules) . Global and Owned Components All components are initially local. In this state they are not addressable by or accessible to others. To change this we globalize them.
This is done after instantiation by first calling prepare_to_globalize (setting the component access rules and reserving an address) on the new component, then calling the globalize method like so: .instantiate() .prepare_to_globalize(OwnerRole::None) .globalize(); Without these steps, to use a component it must be internal to another. This is done by adding the component to a parent component's struct, with the wrapping Owned type, e.g. struct CandyStore { gumball_machine: Owned<GumballMachine>, } The owned component's methods can then be called by its parent, but no other components. e.g. self.gumball_machine.buy_gumball(payment); Let's compare how this looks in our example packages: the package with an owned GumballMachine vs the package with a global GumballMachine.

Owned vs Global GumballMachine The global GumballMachine is the same as from previous examples. The owned version is where we see several changes.

Restricted methods - Owned: No method restrictions. This is now handled by the parent component. - Global: enable_method_auth! { methods { buy_gumball => PUBLIC; get_status => PUBLIC; set_price => restrict_to: [OWNER]; withdraw_earnings => restrict_to: [OWNER]; refill_gumball_machine => restrict_to: [OWNER]; }}

Instantiation function start - Owned: The parent component address is passed in, and only the new component is returned as there's no owner badge. pub fn instantiate_gumball_machine( price: Decimal, parent_component_address: ComponentAddress, ) -> Owned<GumballMachine> { - Global: A component address is reserved, and both the component and owner badge are returned. pub fn instantiate_gumball_machine( price: Decimal ) -> (Global<GumballMachine>, Bucket) { // reserve an address for the component let ( address_reservation, component_address, ) = Runtime::allocate_component_address( GumballMachine::blueprint_id() );

Owner badge - Owned: No owner badge. The parent component is the owner. - Global: let owner_badge = ...

Gumball mint roles - Owned: .mint_roles(mint_roles! { minter => rule!(require( global_caller( parent_component_address ))); minter_updater => rule!(deny_all); }) - Global: .mint_roles(mint_roles! { minter => rule!(require( global_caller( component_address ))); minter_updater => rule!(deny_all); })

Instantiation function end - Owned: .instantiate() - Global: .instantiate() .prepare_to_globalize(OwnerRole::Fixed(rule!(require( owner_badge.resource_address() )))) .with_address(address_reservation) .globalize();

The owned version is simpler, as we're able to remove much of the access and control code. The possible downside is that the owned component no longer has a global address, so it cannot be accessed by components other than its owner, such as those in other packages. For some purposes this is ideal though.

Owner vs Non-owner CandyStore The CandyStore is globalized in both versions of our package (this and the previous section's). The code for its blueprint is simpler here, where it owns the GumballMachine. In this version there's no need for a GumballMachine owner badge stored in this component. The only methods accessible on either blueprint are those of the CandyStore, so we restrict access to the necessary methods on the CandyStore and don't need to add restrictions to the GumballMachine. No method restrictions in the GumballMachine means no need to hold the GumballMachine owner badge and pass proof of ownership to the GumballMachine to call its restricted methods. This is done with the authorize_with_amount method for the previous global GumballMachine, which we don't have to use at all in this version.

Component state - Owner (owned GumballMachine): struct CandyStore { gumball_machine: Owned<GumballMachine>, } - Non-owner (global GumballMachine): struct CandyStore { gumball_machine: Global<GumballMachine>, gumball_machine_owner_badges: Vault, }

Address reservation - Owner (owned GumballMachine): let (address_reservation, component_address) = Runtime::allocate_component_address( CandyStore::blueprint_id() ); - Non-owner (global GumballMachine): None.
The GumballMachine address and owner badge are used to restrict access to component methods and token behaviours instead.

Gumball machine instantiation - Owner (owned GumballMachine): let gumball_machine = GumballMachine::instantiate_gumball_machine( gumball_price, component_address, ); - Non-owner (global GumballMachine): let ( gumball_machine, gumball_machine_owner_badge, ) = GumballMachine::instantiate_gumball_machine( gumball_price );

Globalizing - Owner (owned GumballMachine): .instantiate() .prepare_to_globalize( OwnerRole::Fixed(rule!(require( owner_badge.resource_address() )))) .with_address(address_reservation) .globalize(); - Non-owner (global GumballMachine): .instantiate() .prepare_to_globalize( OwnerRole::Fixed(rule!(require( owner_badge.resource_address() )))) .globalize();

Calling restricted GumballMachine methods - Owner (owned GumballMachine): pub fn set_gumball_price( &mut self, new_price: Decimal ) { self.gumball_machine.set_price( new_price ); } - Non-owner (global GumballMachine): To call a method on the GumballMachine we need to pass a proof that we have its owner badge. pub fn set_gumball_price( &mut self, new_price: Decimal ) { self.gumball_machine_owner_badges .as_fungible() .authorize_with_amount( 1, || self.gumball_machine.set_price( new_price )); }

Owned components make for simpler code that is easier to read and maintain. But you will have to decide whether removing the global accessibility of some components works for your applications. Using the Candy Store Instructions on how to setup and use the multi-blueprint package yourself can be found in the Official Example GitHub repository (https://github.com/radixdlt/official-examples/tree/main/step-by-step/16-candy-store-owned-modules#using-the-candy-store) ## Build a Multi-Blueprint Package URL: https://radix.wiki/developers/legacy-docs/build/learning-step-by-step/learning-to-build-a-multi-blueprint-package Updated: 2026-02-18 Summary: Up until now we've had packages of only one blueprint.
In this section we will build a package with more. A candy store containing a single gumball machine (and Build > Learning Step-by-Step > Build a Multi-Blueprint Package — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-build-a-multi-blueprint-package.md) Up until now we've had packages of only one blueprint. In this section we will build a package with more. A candy store containing a single gumball machine (and nothing else) will be represented with blueprints that, when instantiated, will become two components. There are two broad ways to do this, with distinct advantages. This section covers a version using only global components. The other version, using owned components, is covered in the next section. The scrypto package referenced in this section can be found in our official examples here (https://github.com/radixdlt/official-examples/tree/main/step-by-step/15-candy-store-modules) . Modular Packages Using several blueprints in one package (../scrypto-1/scrypto-design-patterns/reusable-blueprints-pattern.md#small-modular-reusable-blueprints) is a common pattern with the advantages of making it: - easier to manage and upgrade parts of your application, - more secure, by limiting the scope of each component, - easier to test and debug, - easier to reuse components in other packages. For these reasons it's normally a good idea to split your application into several blueprints, each in their own file, though it's not always necessary. This is done by placing each blueprint in its own file in the src/ directory, alongside the lib.rs file. For example: src/ ├── candy_store.rs ├── gumball_machine.rs └── lib.rs lib.rs is the starting point of a Scrypto package. Any blueprint modules in the package not in lib.rs itself must be added using the mod keyword, like so: mod candy_store; mod gumball_machine; For a blueprint that directly uses another, we also need to import it into the blueprint's file.
For example, the CandyStore blueprint uses the GumballMachine, so we import it at the top of candy_store.rs, with the use keyword: use crate::gumball_machine::gumball_machine::*; There are then two ways for us to prepare our components to work together. We can globalize them, or we can make one owned by the other. Each requires a different setup to keep methods and tokens accessible and secure. Let's look at the global version. Modular Package Blueprints Our package has two blueprints, CandyStore and GumballMachine. The GumballMachine blueprint The global GumballMachine remains the same as in previous sections. - Some of its methods are restricted to its owner in the enable_method_auth! macro at the top of the blueprint. enable_method_auth! { methods { buy_gumball => PUBLIC; get_status => PUBLIC; set_price => restrict_to: [OWNER]; withdraw_earnings => restrict_to: [OWNER]; refill_gumball_machine => restrict_to: [OWNER]; } } - The component's address is reserved for token access rules. pub fn instantiate_gumball_machine(price: Decimal) -> (Global<GumballMachine>, Bucket) { // reserve an address for the component let (address_reservation, component_address) = Runtime::allocate_component_address( GumballMachine::blueprint_id() ); - An owner badge is created. This will later be stored in the CandyStore, so it can call restricted methods on the GumballMachine. let owner_badge = ... - Mint roles are set using the component's reserved address, ensuring that only the GumballMachine can mint new tokens. .mint_roles(mint_roles! { minter => rule!(require( global_caller(component_address) )); minter_updater => rule!(deny_all); }) - Proof of the owner badge is made the required authorization for ownership and the address reservation is applied to the new component.
```rust
.instantiate()
.prepare_to_globalize(OwnerRole::Fixed(rule!(require(
    owner_badge.resource_address()
))))
.with_address(address_reservation)
.globalize();
```

The CandyStore blueprint

Our CandyStore has been simplified in comparison to the last section, by removing the custom auth roles, candy and chocolate eggs. It now contains a GumballMachine, but it does have some new complexity, as it needs to hold the GumballMachine owner badge and pass proof of that ownership back to the GumballMachine when calling its restricted methods. This is done with the authorize_with_amount method.

- The component state holds the GumballMachine and a Vault containing the GumballMachine owner badge.

```rust
struct CandyStore {
    gumball_machine: Global<GumballMachine>,
    gumball_machine_owner_badges: Vault,
}
```

- The gumball machine is instantiated as a part of the candy store's own instantiate function.

```rust
let (gumball_machine, gumball_machine_owner_badge) =
    GumballMachine::instantiate_gumball_machine(gumball_price);
```

- To call the GumballMachine's public methods we can simply call them on the CandyStore's internal gumball_machine.

```rust
pub fn buy_gumball(&mut self, mut payment: Bucket) -> (Bucket, Bucket) {
    self.gumball_machine.buy_gumball(payment)
}
```

- To call a restricted method on the GumballMachine we need to pass a proof that we have its owner badge, by calling authorize_with_amount on the vault containing it.

```rust
pub fn set_gumball_price(&mut self, new_price: Decimal) {
    self.gumball_machine_owner_badges
        .as_fungible()
        .authorize_with_amount(1, || self.gumball_machine.set_price(new_price));
}
```

Authorize with Amount

authorize_with_amount is a Vault method that allows the use of the vault's contents as a proof for a function call. The amount of tokens used as the proof is specified by the first argument. The second argument is a closure (anonymous function) that will be called with the proof in place. For non-fungibles, the equivalent method is authorize_with_non_fungibles.
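As a mental model, authorize_with_amount can be pictured as the vault minting a short-lived proof that only exists for the duration of the closure. The sketch below is plain Rust with illustrative stand-in types, not engine code; note that in real Scrypto the closure takes no arguments because the proof is placed on the authorization zone automatically, whereas this model passes it explicitly:

```rust
// Plain-Rust model of `authorize_with_amount`. All types here are
// illustrative stand-ins, not Scrypto's actual Vault/Proof types.
struct Proof {
    amount: u64,
}

struct BadgeVault {
    amount: u64, // badges currently held
}

impl BadgeVault {
    // Lend a proof of `amount` badges to `f` for one call. The proof is a
    // borrowed value, so it cannot escape the closure's scope.
    fn authorize_with_amount<R>(&self, amount: u64, f: impl FnOnce(&Proof) -> R) -> R {
        assert!(amount <= self.amount, "vault holds too few badges for this proof");
        let proof = Proof { amount };
        f(&proof) // proof is dropped as soon as the call returns
    }
}

// A "restricted method" that demands proof of at least one owner badge.
fn set_price(proof: &Proof, new_price: u64) -> u64 {
    assert!(proof.amount >= 1, "owner proof required");
    new_price
}

fn main() {
    let vault = BadgeVault { amount: 1 };
    let updated = vault.authorize_with_amount(1, |proof| set_price(proof, 25));
    println!("price set to {updated}");
}
```

The key property the model captures is that the proof never leaves the vault's control: it exists only inside the closure call, so restricted methods can be invoked without the badge itself ever changing hands.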
Using the Candy Store

Instructions on how to set up and use the multi-blueprint package yourself can be found in the Official Example GitHub repository (https://github.com/radixdlt/official-examples/tree/main/step-by-step/15-candy-store-modules#using-the-candy-store).

Final Thoughts

All the CandyStore methods correspond to ones on the GumballMachine. This makes it little more than a wrapper for the GumballMachine. However, you can easily imagine a more complex example where the candy store contains multiple gumball machines, all instantiated from the same blueprint. It could even contain multiple types of products again, with blueprints for each category of product with similar properties. This type of modularity makes it easier to manage, expand and upgrade packages even as they grow in complexity.

## Make Recallable Badges

URL: https://radix.wiki/developers/legacy-docs/build/learning-step-by-step/learning-to-make-recallable-badges
Updated: 2026-02-18

Build > Learning Step-by-Step > Make Recallable Badges — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-make-recallable-badges.md)

It's time to introduce some more resource behaviors. In this section, we will add the ability to recall and burn staff badges to our candy store (from Build a Candy Store (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-build-a-candy-store.md)). We don't want staff that stop working for us to keep their badges and the access those give them. We'll trigger these behaviors in a different way to previous sections. Instead of adding a new method, the recall action will be described purely in a transaction manifest, but to allow this we will need to add some new permissions to the staff badge resource.
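To build intuition for what recall means before touching the Scrypto, here is a plain-Rust model: recall pulls a badge out of someone else's vault, but only for callers that satisfy the recaller rule. Everything here (the Role enum, string vault addresses) is an illustrative stand-in, not engine API:

```rust
use std::collections::HashMap;

// Illustrative stand-ins for on-ledger roles and vaults.
enum Role {
    Owner,
    Manager,
    Staff,
}

struct Ledger {
    // vault address -> non-fungible local ids held in that vault
    vaults: HashMap<String, Vec<u64>>,
}

impl Ledger {
    // Model of a recall: move a badge out of a vault the caller does not
    // own, provided the recaller rule (owner OR manager) is satisfied.
    fn recall(&mut self, caller: &Role, vault: &str, badge_id: u64) -> Result<u64, String> {
        if !matches!(caller, Role::Owner | Role::Manager) {
            return Err("recaller rule not satisfied".to_string());
        }
        let contents = self.vaults.get_mut(vault).ok_or("no such vault")?;
        let pos = contents
            .iter()
            .position(|id| *id == badge_id)
            .ok_or("badge not in vault")?;
        Ok(contents.remove(pos)) // the recalled badge, now in the caller's hands
    }
}

fn main() {
    let mut ledger = Ledger {
        vaults: HashMap::from([("staff_vault_1".to_string(), vec![1])]),
    };
    // a staff member cannot recall badges...
    assert!(ledger.recall(&Role::Staff, "staff_vault_1", 1).is_err());
    // ...but a manager can
    assert_eq!(ledger.recall(&Role::Manager, "staff_vault_1", 1), Ok(1));
    println!("badge recalled");
}
```

The real recall on Radix works the same way in spirit: the badge holder's cooperation is not required, only a caller that can satisfy the recaller rule we define on the resource.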
The scrypto package referenced in this section can be found in our official examples here (https://github.com/radixdlt/official-examples/tree/main/step-by-step/14-candy-store-with-recallable-badges).

Recallable and Burnable Resources

By default, tokens can be neither recalled nor burned. Turning these behaviors on will allow us both to bring a token to us from another vault and to burn/destroy it. If we want to add these behaviors, we do it the same way the mintable behavior (or any other) would be added, by including roles for each.

```rust
let staff_badges_manager =
    // --snip--
    .recall_roles(recall_roles! {
        recaller => rule!(
            require(owner_badge.resource_address())
                || require(manager_badge.resource_address())
        );
        recaller_updater => rule!(deny_all);
    })
    .burn_roles(burn_roles! {
        burner => rule!(
            require(owner_badge.resource_address())
                || require(manager_badge.resource_address())
        );
        burner_updater => rule!(deny_all);
    })
    .create_with_no_initial_supply();
```

The rules for these roles are a little different to the mint roles we added in previous examples. They accept either the owner or manager badges as authorization, not the component's address. Recall and burn therefore can't and won't be called by any of the component's methods. They will instead be called on the vault containing the staff badge and the recalled badge bucket respectively, shown in the recall_staff_badge.rtm transaction manifest here:

```
RECALL_NON_FUNGIBLES_FROM_VAULT
    Address("")
    Array<NonFungibleLocalId>(
        NonFungibleLocalId("#1#"),
    )
;
```

And here:

```
BURN_RESOURCE
    Bucket("staff_badge_bucket")
;
```

This is another way we can interact with resources in the Radix Engine. If they have rules that allow it, we can call them directly from the transaction manifest. A full list of the available manifest actions can be found in the Manifest Instructions (/manifest-instructions) section of the docs.

Making Staff Badges Non-Fungible

We've also made the Candy Store staff badges non-fungible.
This change from the previous section's Candy Store blueprint allows us to assign the badges to specific staff members and more easily identify them. To make this change we added a struct for the staff badge non-fungible data:

```rust
#[derive(NonFungibleData, ScryptoSbor, Clone)]
struct StaffBadge {
    employee_number: u64,
    employee_name: String,
}
```

Then we changed the staff badge to create a non-fungible resource using the new struct and the new_integer_non_fungible method:

```rust
let staff_badges_manager = ResourceBuilder::new_integer_non_fungible::<StaffBadge>(OwnerRole::None)
    // --snip--
    .create_with_no_initial_supply();
```

Changing the staff badge creation to create_with_no_initial_supply() also means it now produces a ResourceManager instead of the Bucket that mint_initial_supply() produces. There are a few more minor simplifications to the instantiate function you might notice that account for this.

More significantly, the change to non-fungible staff badges means we need to change the minting method. It now takes 2 arguments, the name and number of the employee, which become the stored non-fungible data. The number is also used as the local ID for the non-fungible, so it must be unique. The function is now:

```rust
pub fn mint_staff_badge(&mut self, name: String, number: u64) -> Bucket {
    let staff_badge_bucket: Bucket = self.staff_badge_resource_manager.mint_non_fungible(
        &NonFungibleLocalId::integer(number),
        StaffBadge {
            employee_number: number,
            employee_name: name,
        },
    );
    staff_badge_bucket
}
```

Using the Recallable Badges

Instructions on using the Candy Store and recallable badges are with the example code on GitHub (https://github.com/radixdlt/official-examples/tree/main/step-by-step/14-candy-store-with-recallable-badges#using-the-candy-store-with-recallable-badges).

## Build a Candy Store

URL: https://radix.wiki/developers/legacy-docs/build/learning-step-by-step/learning-to-build-a-candy-store
Updated: 2026-02-18
Build > Learning Step-by-Step > Build a Candy Store — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-build-a-candy-store.md)

This is a good opportunity to introduce another new blueprint. This time we'll create a candy store with two products: candy tokens and chocolate egg non-fungibles that each have toys inside. We're also going to have not just an owner role, but manager and staff roles too, so the store owner doesn't have to handle all the day-to-day running of the store. This introduces a new concept, authorization roles, which we will explain.

The scrypto package referenced in this section can be found in our official examples here (https://github.com/radixdlt/official-examples/tree/main/step-by-step/13-candy-store).

Authorization Roles

Authorization in Scrypto is handled with roles. Each component has two predefined roles, the Owner role and the Self role. In previous examples, we used the Owner role to restrict access to multiple methods on our gumball machines. Here we'll add two more custom roles, the Manager and the Staff roles.

Adding Roles

Additional roles are defined in the enable_method_auth! macro at the top of the blueprint code. We add the manager and staff roles here:

```rust
enable_method_auth! {
    roles {
        manager => updatable_by: [OWNER];
        staff => updatable_by: [manager, OWNER];
    },
```

The component methods can then be restricted to one or more roles:

```rust
    methods {
        buy_candy => PUBLIC;
        buy_chocolate_egg => PUBLIC;
        get_prices => PUBLIC;
        set_candy_price => restrict_to: [manager, OWNER];
        set_chocolate_egg_price => restrict_to: [manager, OWNER];
        mint_staff_badge => restrict_to: [manager, OWNER];
        restock_store => restrict_to: [staff, manager, OWNER];
        withdraw_earnings => restrict_to: [OWNER];
    }
}
```

Badge Creation

Along with the Owner badge, we now need a Manager badge and Staff badges.
The Manager badge is just like the Owner badge, but with name and symbol metadata of Manager Badge and MNG respectively.

```rust
let manager_badge: Bucket = ResourceBuilder::new_fungible(OwnerRole::None)
    .metadata(metadata!(
        init {
            "name" => "Manager Badge", locked;
            "symbol" => "MNG", locked;
        }
    ))
    .divisibility(DIVISIBILITY_NONE)
    .mint_initial_supply(1)
    .into();
```

The Staff badge, as well as having its own name and symbol, is mintable in case we hire more staff. To make it mintable we've set the minter rule to require the component address as proof. To do that, we have to reserve an address for the component at the beginning of the instantiate function.

```rust
let (address_reservation, component_address) =
    Runtime::allocate_component_address(CandyStore::blueprint_id());
```

The component_address can then be used in the minter rule.

```rust
let staff_badge: Bucket = ResourceBuilder::new_fungible(OwnerRole::None)
    .metadata(metadata!(
        init {
            "name" => "Staff Badge", locked;
            "symbol" => "STAFF", locked;
        }
    ))
    .divisibility(DIVISIBILITY_NONE)
    .mint_roles(mint_roles! {
        // add component address to minter rule
        minter => rule!(require(global_caller(component_address)));
        minter_updater => rule!(deny_all);
    })
    .mint_initial_supply(2)
    .into();
```

This means that only the instantiated CandyStore component can mint the Staff badge, which can be done with the mint_staff_badge method called on the component. This is the Virtual Badge pattern described in Use the Gumball Machine on Stokenet (/learning-to-use-the-gumball-machine-on-stokenet). The same minting rule is applied to our candy and chocolate egg tokens. This means we'll know they must have come from our buy_candy and buy_chocolate_egg methods, as only the CandyStore component can mint them.

Instantiation

Now that we have these new roles and rules, we need to instantiate the component with rules stating which proofs are required to fulfil which roles.
We also need to give the new component its address, reserved at the beginning of the instantiate function.

```rust
let component = Self {
    // --snip--
}
.instantiate()
.prepare_to_globalize(OwnerRole::Fixed(rule!(require(
    owner_badge.resource_address()
))))
// define required proofs for custom roles
.roles(roles!(
    manager => rule!(require(manager_badge.resource_address()));
    staff => rule!(require(staff_badge.resource_address()));
))
// apply the address reservation to the component
.with_address(address_reservation)
.globalize();
```

Candy Store Methods

Buying a chocolate egg works the same as buying a gumball in previous sections. You provide a payment, and if it is more than the cost of an egg, you get one chocolate egg and change.

```rust
pub fn buy_chocolate_egg(&mut self, mut payment: Bucket) -> (Bucket, Bucket) {
```

Buying candy is a little different. You can buy as much candy as is currently in stock at once, if you have the tokens to pay for it. The payment is divided by the candy price, and you receive as many candy tokens as you can afford for your XRD; any change is returned with the candy.

```rust
pub fn buy_candy(&mut self, mut payment: Bucket) -> (Bucket, Bucket) {
```

Checked Math

The buy_candy method uses checked mathematical operations to prevent overflow, which might lock a component. It is highly recommended that you do the same in any Decimal calculations.

```rust
let candy_amount = payment
    .amount()
    .checked_div(self.candy_price)
    .unwrap()
    .checked_round(0, RoundingMode::ToZero)
    .unwrap();
```

See the Decimal Overflows (../build-dapps/before-you-release/code-hardening.md#pay-special-attention-to-decimal-operations) section of the docs for more information.

There are also two methods to set the price of the candy and chocolate eggs. These are restricted to the manager and owner roles.
```rust
pub fn set_candy_price(&mut self, new_price: Decimal) {
pub fn set_chocolate_egg_price(&mut self, new_price: Decimal) {
```

Minting staff badges is also restricted to the manager and owner roles. The method returns the minted badge rather than storing it in the component.

```rust
pub fn mint_staff_badge(&mut self) -> Bucket {
```

The restock_store method can be called by staff, manager or owner role holders. It now adds resources to two vaults, adding one of each type of egg and refilling the candy vault to 100.

```rust
pub fn restock_store(&mut self) {
    let candy_amount = 100 - self.candy.amount();
    self.candy
        .put(self.candy_resource_manager.mint(candy_amount));
    let eggs = [
        Egg { toy: Toy::Dinosaur },
        Egg { toy: Toy::Unicorn },
        Egg { toy: Toy::Dragon },
        Egg { toy: Toy::Robot },
        Egg { toy: Toy::Pony },
    ];
    // loop through eggs array
    for egg in eggs.iter() {
        self.chocolate_eggs.put(
            // mint a non-fungible for each egg
            self.chocolate_egg_resource_manager
                .mint_ruid_non_fungible(egg.clone()),
        )
    }
}
```

Using the Candy Store

We've put instructions for using the Candy Store in our official examples on GitHub (https://github.com/radixdlt/official-examples/tree/main/step-by-step/13-candy-store#using-the-candy-store).

## Create Your First Non-Fungible

URL: https://radix.wiki/developers/legacy-docs/build/learning-step-by-step/learning-to-create-your-first-non-fungible
Updated: 2026-02-18

Build > Learning Step-by-Step > Create Your First Non-Fungible — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-create-your-first-non-fungible.md)

So far this series has only focused on fungible resources. This will be our first look at non-fungibles. By making some small changes to our starting Hello template, we can create it with a non-fungible instead of a fungible resource.
That resource can then be obtained with the same free_token method as we used in the original template.

The code referenced in this section can be found in our official examples here (https://github.com/radixdlt/official-examples/tree/main/step-by-step/12-hello-non-fungible).

Non-Fungible Resources

Non-fungibles, like fungible resources, are native to Radix, so they behave like real-world objects and have other behaviors guaranteed by the Radix Engine. What sets them apart is that each token is unique, meaning non-fungible tokens minted from the same resource manager are not interchangeable. This is in contrast to fungible resources, where any token from the same resource manager is identical and interchangeable. If fungible resources are like money, non-fungibles are like collectibles.

Non-fungibles have several properties that differ from fungibles on Radix. Firstly, they have a unique identifier, their NonFungibleLocalID. This can be an integer, string, byte array or RUID (Radix Unique Identifier). It is used to identify the non-fungible within the collection's resource address and must be of the same type throughout the same collection, e.g. a non-fungible collection of playing cards could all have integer local IDs, or all have string local IDs, but not some with integers and some with strings. In our example we'll use RUIDs, as the Radix Engine will make it easy and generate them for us. For the other types, we would need to specify them ourselves (see below).

Non-fungibles also have NonFungibleData, which can take the form of any data you require. Its structure will need to be defined outside of the blueprint. In our case the non-fungible data structure definition is just above the blueprint code.

What's Changed

To make the Hello example non-fungible, we need to make a few small changes to the Hello template created when you run scrypto new-package hello. Firstly, we need to add our NonFungibleData structure in the form of the Greeting struct.
```rust
#[derive(ScryptoSbor, NonFungibleData)]
pub struct Greeting {
    text: String,
}
```

Then we need to change our HelloToken resource to be non-fungible by changing the ResourceBuilder method used from new_fungible to new_ruid_non_fungible.

```rust
let my_bucket: Bucket = ResourceBuilder::new_ruid_non_fungible(OwnerRole::None)
```

Resource Builder methods

ResourceBuilder methods include new_ruid_non_fungible, new_integer_non_fungible, new_string_non_fungible and new_bytes_non_fungible. These correspond to the NonFungibleLocalID types of RUID (Radix Unique Identifier), integer, string or byte array.

For clarity we also change the metadata, so we have a new name and symbol for our resource.

```rust
.metadata(metadata! {
    init {
        "name" => "HelloNonFungible", locked;
        "symbol" => "HNF", locked;
    }
})
```

Finally, we need to mint our initial supply of non-fungibles. This is done by calling the mint_initial_supply method on our resource builder, which now takes an array of NonFungibleData in the form described in our Greeting struct.

```rust
.mint_initial_supply([
    Greeting {
        text: "Hello".into(),
    },
    Greeting {
        text: "Pleased to meet you".into(),
    },
    Greeting {
        text: "Welcome to Radix".into(),
    },
    Greeting {
        text: "Salutations".into(),
    },
    Greeting { text: "Hi there".into() },
])
```

Non-RUID Non-Fungible Local IDs

A NonFungibleLocalID must be specified for a non-RUID non-fungible (integer, string or bytes) when minting. To do this, state the local ID before the NonFungibleData in a tuple, e.g.
if we create our non-fungibles with integer local IDs like so:

```rust
let my_bucket: Bucket = ResourceBuilder::new_integer_non_fungible(OwnerRole::None)
```

Then we would mint like this:

```rust
.mint_initial_supply([
    (
        // NonFungibleLocalID
        IntegerNonFungibleLocalId::new(1),
        // NonFungibleData
        Greeting {
            text: "Hello world!".into(),
        },
    ),
])
```

Running Hello Non-Fungible

To run the updated Hello example, follow the steps described in our official examples on GitHub (https://github.com/radixdlt/official-examples/tree/main/step-by-step/12-hello-non-fungible#running-the-example).

## Set Verification Metadata

URL: https://radix.wiki/developers/legacy-docs/build/learning-step-by-step/learning-to-set-verification-metadata
Updated: 2026-02-18

Build > Learning Step-by-Step > Set Verification Metadata — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-set-verification-metadata.md)

After getting our Gumball Machine dapp up and running on the Stokenet test network, there are still a few things that would stop it from working correctly on the main Radix network and in the Radix Wallet. This section will show you how to add metadata to your dapp definition, and link that to your on-ledger component and front end, to get it working and displaying correctly on all versions of the network.

The scrypto package referenced in this section can be found in our official examples here (https://github.com/radixdlt/official-examples/tree/main/step-by-step/11-gumball-machine-dapp-verification-data).

Two Way Linking

You may have noticed when using example dapps in previous sections that when your wallet receives a transaction, it appears to come from "1 Unknown Components".
This is because the wallet cannot be sure the linked dapp definition really belongs to that dapp, and that someone hasn't created a fake dapp pointing at a more credible definition. The way we solve this problem is two way linking. Two way linking involves linking a dapp definition to a component, resources and web address, and linking the component, resources and web address back to the dapp definition. Linking in both directions proves the validity of the relationships. For components, this means the wallet will use the dapp definition's name, image_url, and other metadata for identification, to show the user which dapp they're interacting with. For resources, it allows their associated dapp to be displayed, so you know which machine your gumball came from. We add these links in the metadata of the entities.

Metadata for Verification

The system of dapp, component and resource verification (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/metadata-standards/metadata-for-verification.md) involves a dapp definition and a collection of metadata fields for linking entities together. Each component can have a dapp_definition stored in metadata, and each resource can be linked to multiple dapp_definitions. The dapp definition account will need to have all of these components and resources stored in its claimed_entities metadata to complete the links. The dapp definition can also link to other dapp_definitions and claimed_websites. In the last case, the website will need to have a .well-known/radix.json file with the dapp definition address in it to complete the link. The full list, with data types for each, can be found in the Metadata for verification (../../reference/standards/metadata-standards/metadata-for-verification.md#metadata-standards-for-verification-of-onledger-entities) section of the documentation.

Often, when components and resources are created, we won't have a dapp definition yet.
Fortunately, metadata can be updated by a component or resource owner by default (../../reference/radix-engine/metadata/entity-metadata.md#updating-and-locking-metadata), so all we need to do is give them an owner, then update the metadata when we have a dapp definition. Ownership is again a shortcut to the most used parts of the authorization system. You can see this change when creating the Gumball resource for this section.

```rust
let bucket_of_gumballs: Bucket = ResourceBuilder::new_fungible(
    OwnerRole::Fixed(rule!(require(
        owner_badge.resource_address()
    ))))
```

By adding the OwnerRole::Fixed to the resource, we can update the metadata after initial creation as long as we possess the owner_badge.

Running the Gumball Machine dApp with Verification Metadata

To get the Gumball Machine set up with verification metadata and working in the browser for yourself, go to the official examples repo's instructions (https://github.com/radixdlt/official-examples/tree/main/step-by-step/11-gumball-machine-dapp-verification-data#setting-up-the-gumball-machine-dapp) and follow the steps.

## Run the Gumball Machine Front End dApp

URL: https://radix.wiki/developers/legacy-docs/build/learning-step-by-step/learning-to-run-the-gumball-machine-front-end-dapp
Updated: 2026-02-18

Build > Learning Step-by-Step > Run the Gumball Machine Front End dApp — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-run-the-gumball-machine-front-end-dapp.md)

If you aren't planning on using a front end, you can skip this section of the series. In the previous Step-by-Step section we looked at the basics of how to create a dApp with a simple front end. In this one we'll take this further by applying the same concepts to our Gumball Machine package.
We deployed a Gumball Machine package onto the test network in Use the Gumball Machine on Stokenet (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-use-the-gumball-machine-on-stokenet.md), so all this section will introduce is a front end for that package. (You may need to go back to the Use the Gumball Machine on Stokenet (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-use-the-gumball-machine-on-stokenet.md) section if you don't already have a deployed and working Gumball Machine.)

This example will be more complex than the last, introducing the Gateway API to query network state, and using all our transaction manifests from Use the Gumball Machine on Stokenet. Just like the last section though, you'll want some familiarity with JavaScript and front end web development before jumping in.

The code for the dapp referenced in this section can be found in our official examples here (https://github.com/radixdlt/official-examples/tree/main/step-by-step/10-gumball-machine-front-end).

File Structure

Our file structure will be similar to the last section, with the addition of a manifests directory to hold our transaction manifests, now converted into JavaScript functions so they can be called by main.js.

```
/
├── client/
│   ├── index.html
│   ├── main.js
│   ├── package.json
│   ├── style.css
│   ├── manifests/
│   └── ...
└── scrypto-package/
    └── ...
```

Gumball Machine Transactions

The transactions sent by the front end cover the instantiate_gumball_machine function and all but one of the blueprint's methods (get_status isn't included, as we can get the price and remaining number of gumballs from the component state via the Gateway API instead). Each transaction manifest is generated from one of the manifest functions in client/manifests/. These functions take the same arguments as the corresponding blueprint method, plus necessary addresses.
For example, the refill_gumball_machine manifest function:

```javascript
export const refillManifest = (
  accountAddress,
  componentAddress,
  ownerBadgeAddress
) => `
CALL_METHOD
    Address("${accountAddress}")
    "create_proof_of_amount"
    Address("${ownerBadgeAddress}")
    Decimal("1")
;
CALL_METHOD
    Address("${componentAddress}")
    "refill_gumball_machine"
;
CALL_METHOD
    Address("${accountAddress}")
    "deposit_batch"
    Expression("ENTIRE_WORKTOP")
;`;
```

(You can also see that we've used our owner badge in this manifest, to pass the required proof.)

The transactions are then sent to the wallet for signing and submission to the network:

```javascript
// Send manifest to wallet for signing
const result = await rdt.walletApi.sendTransaction({
  transactionManifest: manifest,
  version: 1,
});
```

The effects of these transactions are tracked and displayed with the help of the Gateway API.

The Gateway API

We saw how the Radix Dapp Toolkit (RDT) can be used to interact with the network via the Radix Wallet in the last example. If we want to access the network more directly, we use the Gateway API and its npm package (https://github.com/radixdlt/babylon-gateway/tree/main/sdk/typescript/). The Gateway API is set up in a similar way to RDT. It's imported into client/main.js:

```javascript
import { GatewayApiClient } from "@radixdlt/babylon-gateway-api-sdk";
```

Then we generate an instance so we can use its various methods:

```javascript
const gatewayApi = GatewayApiClient.initialize(dappConfig);
```

The Gateway API is used by the Radix Wallet and both the Console and Dashboard that we've been using to deploy packages and instantiate components. It gives us access to a wide array of different network interactions (https://radix-babylon-gateway-api.redoc.ly), but we'll use it to query the network for the state of the ledger, the status of the network and of specific transactions.
Specifically, we'll be using the Gateway API for:

- Getting the status of various transactions after they've been submitted to the network via the wallet:

```javascript
// Fetch the transaction status from the Gateway API
const transactionStatus = await gatewayApi.transaction.getStatus(
  result.value.transactionIntentHash
);
```

- Finding the addresses of the new component and resources after instantiation (as component instantiation is a part of the front end this time):

```javascript
// Fetch the details of changes committed to ledger from Gateway API
const committedDetails = await gatewayApi.transaction.getCommittedDetails(
  result.value.transactionIntentHash
);
console.log("Instantiate committed details:", committedDetails);

// Set addresses from details committed to the ledger in the transaction
componentAddress = committedDetails.transaction.affected_global_entities[2];
ownerBadgeAddress = committedDetails.transaction.affected_global_entities[3];
gumballResourceAddress = committedDetails.transaction.affected_global_entities[4];
```

- Querying the ledger state of our Gumball Machine component to track price, number of gumballs and earnings:

```javascript
async function fetchAndShowGumballMachineState() {
  // Use Gateway API to fetch component details
  if (componentAddress) {
    const componentDetails =
      await gatewayApi.state.getEntityDetailsVaultAggregated(componentAddress);
    console.log("Component Details:", componentDetails);

    // Get the price, number of gumballs, and earnings from the component state
    const price = componentDetails.details.state.fields.find(
      (field) => field.field_name === "price"
    )?.value;
    const numOfGumballs = componentDetails.fungible_resources.items.find(
      (item) => item.resource_address === gumballResourceAddress
    )?.vaults.items[0].amount;
    const earnings = componentDetails.fungible_resources.items.find(
      (item) => item.resource_address === xrdAddress
    )?.vaults.items[0].amount;
```

We then use these values to update the page:
```javascript
    // Show the values on the page
    document.getElementById("numOfGumballs").innerText = numOfGumballs;
    document.getElementById("price").innerText = price;
    document.getElementById("earnings").innerText = earnings + " XRD";
  }
}
```

Running the Gumball Machine dApp

To get the Gumball Machine working in the browser for yourself, go to the official examples repo's instructions (https://github.com/radixdlt/official-examples/tree/main/step-by-step/10-gumball-machine-front-end#running-the-example) and follow the steps.

## Run Your First Front End dApp

URL: https://radix.wiki/developers/legacy-docs/build/learning-step-by-step/learning-to-run-your-first-front-end-dapp
Updated: 2026-02-18

Build > Learning Step-by-Step > Run Your First Front End dApp — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-run-your-first-front-end-dapp.md)

If you aren't planning on using a front end, you can skip this and the next sections in the series.

create-radix-app

If you're in a hurry to get a working front end dapp up and running, use create-radix-app (https://github.com/radixdlt/create-radix-app#readme). It will give you some great templates to build more from. To understand how to turn a template into a dapp of your own, learn how Radix dapps work across this and the next few lessons.

This section shows you how to make a very simple dapp with a front end. We deployed a package to the network and started interacting with it on ledger in the last section. Before we turn that Gumball Machine into a fully fledged dapp, we need to learn how to connect a front end to the Radix network and wallet.
This section shows you how, using the Hello package from the first and second sections (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-explain-your-first-scrypto-project.md) and adding a front end that sends a connected user a free Hello Token. We assume you have some familiarity with JavaScript and front end web development for this section, but it's kept as simple as possible. The code for the dapp referenced in this section can be found in our official examples here (https://github.com/radixdlt/official-examples/tree/main/step-by-step/09-hello-token-front-end) . File Structure Now that we have more than just the scrypto package, we need to reorganize our project a little. We'll put the scrypto package in a scrypto-package directory and add a front end client directory. In the client directory we have an index.html file, a main.js file and a package.json file. The index.html file is the main page of our dapp. The main.js file is the JavaScript that runs on the page. The package.json file is used to install the Radix Dapp Toolkit and has scripts to start and build the front end. There's also a small amount of styling added with the style.css file. / ├── client/ │ ├── index.html │ ├── main.js │ ├── package.json │ ├── style.css │ └── ... └── scrypto-package/ └── ... Dapp Definitions Every dapp needs a dapp definition: an account with metadata that identifies the dapp on the network. It creates a way for the Radix Wallet (and other clients) to know and verify what dapp it's interacting with as well as what components and resources that dapp is associated with. We are only going to connect this dapp to the Stokenet test network, but for Mainnet you will need to provide the dapp definition address in the client (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/dapp-definition-setup.md) as well. Without this, verification will not work and the Radix Wallet will not be able to connect.
This is explained further in the Set Verification Metadata (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-set-verification-metadata.md) section of this step-by-step. The Radix Dapp Toolkit There is a collection of utilities needed to build a dapp on Radix. These include things like ways to query the state of the network, the wallet connect button, ways to send transactions, etc. We've collected some of them into an npm package called the Radix dApp Toolkit (https://github.com/radixdlt/radix-dapp-toolkit) . You can see some of its essential uses in two places in this dapp: First it's used in the client/index.html file, where it provides the wallet connect button. Second it's used in client/main.js to interact with the network and wallet. To do this we first need to import the toolkit: import { DataRequestBuilder, RadixDappToolkit, RadixNetwork, } from "@radixdlt/radix-dapp-toolkit"; Then we generate an instance of the toolkit so we can use its various methods: // Create a dapp configuration object for the Radix Dapp Toolkit const dappConfig = { networkId: RadixNetwork.Stokenet, applicationVersion: "1.0.0", applicationName: "Hello Token dApp", applicationDappDefinitionAddress: dAppDefinitionAddress, }; // Instantiate DappToolkit to connect to the Radix wallet and network const rdt = RadixDappToolkit(dappConfig); We then use the toolkit's wallet API to do a few different things. First we decide what data we want to request from connected wallets: rdt.walletApi.setRequestData(DataRequestBuilder.accounts().exactly(1)); Next, to send a transaction we first need to get the user's account address: const accountAddress = rdt.walletApi.getWalletData().accounts[0].address; Which we use in a transaction manifest that we send back to the wallet: const result = await rdt.walletApi.sendTransaction({ transactionManifest: manifest, version: 1, }); The wallet then calculates fees, prompts the user to sign the transaction and sends it to the network.
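To make the flow above concrete, here is a minimal sketch of the kind of manifest string a front end might build before passing it to sendTransaction. The helper name, addresses, and the exact manifest layout are illustrative assumptions, not the example's verbatim code; the free_token method is the one the Hello component exposes for handing out its token:

```javascript
// Hypothetical helper: build a transaction manifest string that calls a
// component's free_token method and deposits everything left on the
// worktop into the user's account.
function buildClaimManifest(accountAddress, componentAddress) {
  return `
CALL_METHOD
  Address("${componentAddress}")
  "free_token"
;
CALL_METHOD
  Address("${accountAddress}")
  "deposit_batch"
  Expression("ENTIRE_WORKTOP")
;`;
}

// The resulting string is what would be passed as `transactionManifest`
// to rdt.walletApi.sendTransaction(...). Both addresses are placeholders.
const manifest = buildClaimManifest(
  "account_tdx_2_1example", // hypothetical Stokenet account address
  "component_tdx_2_1example" // hypothetical component address
);
```

Templating the manifest from the connected account's address, as sketched here, is why the front end requests the account from the wallet before building the transaction.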
Running the Example All the steps to get the dapp up and running are in the official examples repo (https://github.com/radixdlt/official-examples/tree/main/step-by-step/09-hello-token-front-end#running-the-example) ## Use the Gumball Machine on Stokenet URL: https://radix.wiki/developers/legacy-docs/build/learning-step-by-step/learning-to-use-the-gumball-machine-on-stokenet Updated: 2026-02-18 Summary: In the previous Step-by-Step section we allowed our gumball machine to mint its own gumballs. The blueprint still isn't quite ready for us to publish on the led Build > Learning Step-by-Step > Use the Gumball Machine on Stokenet — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-use-the-gumball-machine-on-stokenet.md) In the previous Step-by-Step section we allowed our gumball machine to mint its own gumballs. The blueprint still isn't quite ready for us to publish on the ledger though. Currently anyone can mint gumballs, so let's look at restricting minting to the component's methods. We'll also add an icon for the gumball token, so it's easily identifiable in wallets and explorers. The scrypto package referenced in this section can be found in our official examples here (https://github.com/radixdlt/official-examples/tree/main/step-by-step/08-ledger-ready-gumball-machine) . Virtual Badges The Owner badge, like other badges we might make, is a resource. We make it the required evidence of ownership in one of the instantiation steps. In some cases it makes more sense to use a component's own address instead of creating a badge. This is known as the Virtual Badge pattern. The Virtual Badge pattern A component does not need to have its own badge to provide permission and perform authorized actions. Instead of a badge the component address itself can act as evidence to perform an action that requires permission.
Address Reservation All components on the Radix ledger have a unique address assigned at instantiation, but the gumball token is defined inside the instantiate function. To have access to the component address inside the function before instantiation is complete, we need to reserve it. We do so right at the start of instantiate_gumball_machine. let (address_reservation, component_address) = Runtime::allocate_component_address(GumballMachine::blueprint_id()); This gives us the component_address to use for the minter rule, and the address_reservation to apply to the component in the last instantiation step. Restricting Mint Roles We update the minter rule to now require the component's own address to mint new gumballs. Only the component, and therefore its methods, will be able to mint. .mint_roles(mint_roles! { minter => rule!(require(global_caller(component_address))); minter_updater => rule!(deny_all); }) The address_reservation is applied in the last instantiation step, giving our component the address we reserved, the same address used to mint new gumballs. .instantiate() .prepare_to_globalize(OwnerRole::Fixed(rule!(require( owner_badge.resource_address() )))) // Apply the address reservation .with_address(address_reservation) .globalize(); Icons for Tokens We now have one last thing to do before we publish the gumball machine. Outside of resim the token will be displayed in a variety of ways in dapps and wallets. So the token can be more visually appealing in these, we need to add an icon for it. This is done with just an extra metadata field called icon_url. "icon_url" => Url::of("https://assets.radixdlt.com/icons/icon-gumball-pink.png"), locked; URLs and strings are not treated the same in the Radix ledger and so we need to use the Url type. The locked keyword, as before, is used to prevent the icon URL from being changed after the component is instantiated. We now have a blueprint we can publish on the ledger.
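The effect of the minter rule above can be illustrated with a toy model. This is not the Radix Engine's real authorization machinery, just a sketch of the logic: rule!(require(global_caller(component_address))) means only calls originating from the component itself pass the check. The addresses are hypothetical:

```javascript
// Toy model of the restricted minter rule: a predicate that passes only
// when the caller is the component itself (the Virtual Badge pattern).
function makeMinterRule(componentAddress) {
  return (caller) => caller === componentAddress;
}

const componentAddress = "component_sim1example"; // hypothetical address
const canMint = makeMinterRule(componentAddress);

canMint(componentAddress); // the component's own methods may mint
canMint("account_sim1example"); // an external account may not
```

Because the rule is keyed to the reserved component address, no separate minter badge has to be created, stored, or passed around.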
Publishing the Package and Using the Gumball Machine The steps to build and publish the package, then use the Gumball Machine blueprint on the Stokenet test network are in our official examples repo (https://github.com/radixdlt/official-examples/tree/main/step-by-step/08-ledger-ready-gumball-machine#publishing-the-gumball-machine) . ## Create and Use Transaction Manifests URL: https://radix.wiki/developers/legacy-docs/build/learning-step-by-step/learning-to-create-and-use-transaction-manifests Updated: 2026-02-18 Summary: In this Step-by-Step section there are no changes to the previous Refillable Gumball Machine(/learning-to-make-your-gumball-machine-refillable) blueprint. Inste Build > Learning Step-by-Step > Create and Use Transaction Manifests — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-create-and-use-transaction-manifests.md) In this Step-by-Step section there are no changes to the previous Refillable Gumball Machine (/learning-to-make-your-gumball-machine-refillable) blueprint. Instead we focus on transaction manifests, what they do and how to use them. The scrypto package referenced in this section can be found in our official examples here (https://github.com/radixdlt/official-examples/tree/main/step-by-step/07-gumball-machine-transaction-manifests) . Transaction Manifests Every transaction in the Radix Engine and Radix Engine Simulator (resim) has a manifest. Transaction manifests are lists of instructions that are followed by the engine. They are listed in order of execution in largely human readable language, so that in most cases we can see what a transaction will do without too much effort. If any step fails for any reason, the entire transaction fails and none of the steps are committed to the ledger on the network. 
Here is an example of a simple transaction manifest to transfer 10 XRD from one account to another: CALL_METHOD Address("account_sim1c956qr3kxlgypxwst89j9yf24tjc7zxd4up38x37zr6q4jxdx9rhma") "lock_fee" Decimal("5") ; CALL_METHOD Address("account_sim1c956qr3kxlgypxwst89j9yf24tjc7zxd4up38x37zr6q4jxdx9rhma") "withdraw" Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3") Decimal("10") ; CALL_METHOD Address("account_sim1c9yeaya6pehau0fn7vgavuggeev64gahsh05dauae2uu25njk224xz") "deposit_batch" Expression("ENTIRE_WORKTOP") ; In the above: - The first method call locks a fee for the transaction (fees are covered below) - The second method call withdraws 10 XRD from an account, leaving it on the transaction worktop The Transaction Worktop The transaction worktop is a temporary storage area for resources during a transaction. Withdrawn resources are automatically placed on the worktop. The transaction cannot be completed until all resources are deposited elsewhere and the worktop is clear. - The third method call deposits everything on the worktop (the 10 XRD) to another account Generating Transaction Manifests Although you can write manifests by hand, it's easier to use resim to generate them for you. With very few modifications they can then be used on the network as well as the simulator. resim generates transaction manifests for us for each transaction, so we don't have to create and apply them ourselves. Normally we don't have access to them, but we can use the --manifest flag to produce the manifest for a given transaction instead of running it. e.g. resim call-function package_sim1pk3cmat8st4ja2ms8mjqy2e9ptk8y6cx40v4qnfrkgnxcp2krkpr92 GumballMachine instantiate_gumball_machine 5 --manifest instantiate_gumball_machine.rtm This will print the manifest to the file instantiate_gumball_machine.rtm, where we can view, modify or run it. Transaction Fees Transactions in the Radix Engine require a small fee to pay for the resources used to run the transaction.
The simulator has these too and you'll see an amount reserved for the fee at the start of each transaction manifest. That looks something like this: CALL_METHOD Address("component_sim1cptxxxxxxxxxfaucetxxxxxxxxx000527798379xxxxxxxxxhkrefh") "lock_fee" Decimal("5000") ; This locks a fee for the transaction from the faucet, a component that will produce an endless supply of XRD whenever we need it. The 5000 value is high enough to cover any incurred fee and the unspent part is left with the component. This component, of course, does not exist on the main Radix network, so these fees have to be handled in different ways. The Radix Wallet can automatically add them for us, but if our transactions do not involve the Wallet we will need to lock fees in the manifest ourselves so they can be processed. For now though, as we are working in the simulator, we do not have to worry. Transaction Manifests in resim To run a transaction manifest in resim we can use the run command, e.g. resim run instantiate_gumball_machine.rtm Performing transactions in this way allows us to run the same transaction multiple times with a much shorter command. Using Transaction Manifests Try using the Gumball Machine transaction manifests yourself. Follow the instructions in our official-examples repo (https://github.com/radixdlt/official-examples/tree/main/step-by-step/07-gumball-machine-transaction-manifests#using-transaction-manifests) .
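The worktop behaviour described above can be modelled in a few lines. This is a toy illustration of the invariant (withdrawn resources sit on the worktop, and the transaction can only commit once the worktop is empty), not the engine's actual implementation:

```javascript
// Toy model of the transaction worktop: a temporary holding area that
// must be empty before the transaction can commit.
class Worktop {
  constructor() { this.resources = new Map(); }
  put(resource, amount) {
    this.resources.set(resource, (this.resources.get(resource) || 0) + amount);
  }
  takeAll(resource) {
    const amount = this.resources.get(resource) || 0;
    this.resources.delete(resource);
    return amount;
  }
  isEmpty() { return this.resources.size === 0; }
}

// Mirroring the example manifest: withdraw 10 XRD from account a,
// then deposit the entire worktop into account b.
const worktop = new Worktop();
const accounts = { a: { XRD: 100 }, b: { XRD: 0 } };

accounts.a.XRD -= 10;
worktop.put("XRD", 10); // "withdraw" leaves the XRD on the worktop
accounts.b.XRD += worktop.takeAll("XRD"); // "deposit_batch" clears it
// worktop.isEmpty() must now hold, or the transaction would fail
```

If the deposit step were omitted, the worktop would still hold 10 XRD at the end, which is exactly the situation the engine rejects.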
## Make Your Gumball Machine Refillable URL: https://radix.wiki/developers/legacy-docs/build/learning-step-by-step/learning-to-make-your-gumball-machine-refillable Updated: 2026-02-18 Summary: Building on the previous Give the Gumball Machine an Owner section, in this one we'll modify our first behaviour and give the machine the ability to mint more G Build > Learning Step-by-Step > Make Your Gumball Machine Refillable — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-make-your-gumball-machine-refillable.md) Building on the previous Give the Gumball Machine an Owner section, in this one we'll modify our first behaviour and give the machine the ability to mint more Gumball tokens. Using the updated behaviour we also add a new method for the owner to refill the gumball machine. The scrypto package referenced in this section can be found in our official examples here (https://github.com/radixdlt/official-examples/tree/main/step-by-step/06-refillable-gumball-machine) . Behaviours Tokens in the Radix Engine have a collection of behaviours that define what they can and can't do. Things like whether they can be withdrawn, burned, or minted. If we don't select their values, they will be set to their defaults (withdrawable, non-burnable, non-mintable etc.). If we want to change these we can do so by adding the desired behaviours, along with the rules to update them, to the token when it's first defined. Fixed Supply vs Mintable By default tokens have a fixed supply. We need to make the gumball resource mintable, so we can create more after the initial batch. Our token is made mintable by adding mint_roles to the bucket_of_gumballs in its initial definition. .mint_roles(mint_roles! { minter => rule!(allow_all); minter_updater => rule!(deny_all); }) For now our minter rule allows anyone to mint new gumballs.
We'll look at restricting this in later examples (see Use The Gumball Machine on Stokenet (learning-to-use-the-gumball-machine-on-stokenet.md#virtual-badges) ). We also need to decide on a rule for who can change the minter rule. For simplicity, we stop anyone from updating it. Resource Managers Now that our gumball resource is mintable, we need access to its resource manager to mint it. We can do this by adding a ResourceManager to our component's state. struct GumballMachine { gum_resource_manager: ResourceManager, gumballs: Vault, collected_xrd: Vault, price: Decimal, } That also means we need to add the resource manager to the component at instantiation. We can do this with the resource_manager method on the bucket. let component = Self { gum_resource_manager: bucket_of_gumballs.resource_manager(), gumballs: Vault::with_bucket(bucket_of_gumballs), collected_xrd: Vault::new(XRD), price: price, } If we had no initial supply of gumballs, defining them would produce a resource manager instead of a bucket, which we could have used directly in the instantiation. New Methods We've added just one new method to the gumball machine this time: - refill_gumball_machine - Takes no arguments and refills the gumballs vault to 100 gumballs. pub fn refill_gumball_machine(&mut self) { let gumball_amount = 100 - self.gumballs.amount(); self.gumballs .put(self.gum_resource_manager.mint(gumball_amount)); } Using the Refillable Gumball Machine Have a go at using the Refillable Gumball Machine by following the instructions in the official-examples repo (https://github.com/radixdlt/official-examples/tree/main/step-by-step/06-refillable-gumball-machine#using-the-refillable-gumball-machine) .
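The refill logic above is a simple top-up calculation, sketched here as a toy model (the machine object and function are illustrative, not the Scrypto code): mint exactly 100 - current, mirroring `let gumball_amount = 100 - self.gumballs.amount();`.

```javascript
// Toy model of refill_gumball_machine: mint just enough new gumballs to
// bring the vault back up to 100.
const machine = { gumballs: 37 }; // hypothetical current vault contents

function refillGumballMachine(m) {
  const mintAmount = 100 - m.gumballs; // amount the resource manager mints
  m.gumballs += mintAmount;            // put the minted bucket in the vault
  return mintAmount;
}

const minted = refillGumballMachine(machine); // mints 100 - 37 = 63
```

Note that minting the difference, rather than a fixed batch of 100, keeps the vault from ever holding more than 100 gumballs regardless of how full it was when refilled.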
## Give the Gumball Machine an Owner URL: https://radix.wiki/developers/legacy-docs/build/learning-step-by-step/learning-to-give-the-gumball-machine-an-owner Updated: 2026-02-18 Summary: Web3 concerns ownership, ownership of digital assets, the distributed ownership of decentralized systems, and more. On Radix, with our asset oriented(../../esse Build > Learning Step-by-Step > Give the Gumball Machine an Owner — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-give-the-gumball-machine-an-owner.md) Web3 concerns ownership, ownership of digital assets, the distributed ownership of decentralized systems, and more. On Radix, with our asset oriented (https://github.com/gguuttss/radix-docs/blob/master/essentials/asset-oriented.md) stack, we implicitly have an easy way to track asset ownership by possession. e.g. an asset is owned by the vault it sits in; a vault is owned by the component it's internal to. But a component and its blueprint don't need to sit within another entity, so for these we declare an owner explicitly. We can then give the owner access to specific privileges and/or they can receive royalties (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/royalties/using-royalties.md) when their component or blueprint is used. Having defined owners works as a useful shortcut in our Authorization Approach (https://github.com/gguuttss/radix-docs/blob/master/essentials/authorization-approach.md) . We'll expand further on authorization in later Step-by-Step sections, but for now, think of an owner as the admin for that entity. Components and resources have owners. Being able to prove you're an owner, with a badge (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/scrypto-design-patterns/user-badge-pattern.md) , can give you access to more of the component.
Therefore, declaring who owners are gives you more power to decide who can use a component and how, and the Radix Engine enforces your decisions and guarantees their effects. We can use this to do things like withdraw the collected XRD in our gumball machine without worrying that anyone else can. We keep track of ownership with badges (learning-to-give-the-gumball-machine-an-owner.md#badges) . Before we dig into what they are and how to use them, let's look at what ownership means for our gumball machine. Ownership and Authorization Each component on the Radix ledger has an owner (though it may be set to None, OwnerRole::None). When writing a blueprint we can make proof of ownership required for individual method calls on the component. For our gumball machine that means, when it's instantiated the instantiator will become the owner and will be the only account that can call some methods on that gumball machine component. We'll choose changing the price of the gumballs and withdrawing the XRD from the machine as our owner restricted methods. You can see this applied in the Gumball Machine with an Owner example (https://github.com/radixdlt/official-examples/tree/main/step-by-step/05-gumball-machine-with-owner) in our official examples. By using the enable_method_auth! macro at the top of our blueprint code we can decide which methods require proof of ownership and which are public. restrict_to: [OWNER] means that the method requires proof of ownership. enable_method_auth! { methods { buy_gumball => PUBLIC; get_status => PUBLIC; set_price => restrict_to: [OWNER]; withdraw_earnings => restrict_to: [OWNER]; } } Badges Evidence of ownership is achieved with a Badge. A Badge can be any normal resource (fungible or non-fungible) whose possession shows that the holder is the owner of a component. Using a badge to prove ownership also makes it possible to transfer ownership between accounts.
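The access rules declared with enable_method_auth! can be sketched as a toy model. This is only an illustration of the logic, not the engine's real authorization system: public methods are always callable, while OWNER-restricted ones require a proof of the owner badge in the caller's auth zone. All names and addresses here are hypothetical:

```javascript
// Toy model of the method auth rules from the blueprint above.
const methodRules = {
  buy_gumball: "PUBLIC",
  get_status: "PUBLIC",
  set_price: "OWNER",
  withdraw_earnings: "OWNER",
};

// A caller is authorized if the method is public, or if their auth zone
// contains a proof of the owner badge resource.
function isAuthorized(method, authZoneProofs, ownerBadgeAddress) {
  if (methodRules[method] === "PUBLIC") return true;
  return authZoneProofs.includes(ownerBadgeAddress);
}

const ownerBadge = "resource_sim1badge"; // hypothetical badge address
isAuthorized("buy_gumball", [], ownerBadge);        // anyone may buy
isAuthorized("set_price", [ownerBadge], ownerBadge); // owner may set price
```

In the real engine the check is enforced automatically against the Authorization Zone; nothing in the method body has to verify the badge itself.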
Proofs Proofs (../scrypto-1/auth/call-a-protected-method-function.md#proofs) are used to prove that a caller has a required badge. Proofs are a special type of resource that only exist for the length of the transaction. They allow a caller to prove that they have a resource without having to risk transferring a badge away and hoping that it is transferred back. A proof of a resource is created from a vault (or bucket) and automatically added to the Authorization Zone (../scrypto-1/auth/call-a-protected-method-function.md#the-authorization-zone) where methods will automatically check for any required proofs. We use a fungible resource for our owner badge: let owner_badge: Bucket = ResourceBuilder::new_fungible(OwnerRole::None) .metadata(metadata!(init{ "name" => "Gumball Machine Owner Badge", locked; })) .divisibility(DIVISIBILITY_NONE) .mint_initial_supply(1) .into(); By making it indivisible and giving it a fixed supply of 1, we make sure there can only be one owner. We then make proof of the owner_badge the evidence of ownership when the component is instantiated with a require rule: .instantiate() .prepare_to_globalize(OwnerRole::Fixed(rule!(require( owner_badge.resource_address() )))) .globalize(); The Owner role is now fixed to (cannot be changed from) proof of the owner_badge resource. The instantiated component and the owner_badge are then returned from the function: (component, owner_badge) Now, when an owner restricted method is called, the caller has to present a proof of the owner_badge or the transaction will fail. Restricted Methods Now that we can restrict methods to just the owner of the gumball machine, we've updated the blueprint from the Build a Gumball Machine (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-build-a-gumball-machine.md) version. The two new methods are: - set_price - Allows the owner to set the price of the gumballs.
pub fn set_price(&mut self, price: Decimal) { self.price = price } - withdraw_earnings - Lets the owner withdraw any XRD collected in the gumball machine, from bought gumballs. pub fn withdraw_earnings(&mut self) -> Bucket { self.collected_xrd.take_all() } These additions are simple, but they start to show how we can use badges to control access to our components. To try out our new gumball machine and its owner badge, have a go by following the instructions in the official-examples Github repo (https://github.com/radixdlt/official-examples/tree/main/step-by-step/05-gumball-machine-with-owner#using-the-gumball-machine-as-an-owner) . Closing Thoughts Hopefully you now have a grasp of how the Owner role works as a shortcut for a usage pattern where you have a single admin, as well as the basics of the authorization system in Scrypto. The tools and system shown here form the foundation of method access limitation (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/structure-roles-and-methods.md) , allowing us to decide exactly who can do what with any component generated from our blueprints. There's still much more to the Scrypto authorisation system, some of which we'll cover in the next part of the Learning Step-by-Step, Make Your Gumball Machine Refillable (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-make-your-gumball-machine-refillable.md) . If you're keen to learn even more now though, you can go to the Authorization (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/auth/README.md) section of these docs. ## Build a Gumball Machine URL: https://radix.wiki/developers/legacy-docs/build/learning-step-by-step/learning-to-build-a-gumball-machine Updated: 2026-02-18 Summary: It's time to take a look at our next Scrypto package. The Gumball Machine is a Radix favorite.
Here we cover the creation of its simplest version, which allows Build > Learning Step-by-Step > Build a Gumball Machine — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-build-a-gumball-machine.md) It's time to take a look at our next Scrypto package. The Gumball Machine is a Radix favorite. Here we cover the creation of its simplest version, which allows users to purchase a gumball in exchange for XRD. This will be the basis for the next few steps in our learning journey. In addition to the most basic operation, this particular example allows the instantiator to set the price (in XRD) of gumballs, and allows callers to submit more than the exact price required. Callers will receive their change (if any) in addition to their tasty gumball. The Radix engine works out what change remains, allowing us to easily return it with the bought gumball. Example Code The Scrypto package referenced in this section can be found in our official examples here (https://github.com/radixdlt/official-examples/tree/main/step-by-step/04-gumball-machine) . If you wish to clone the whole repository you can run git clone https://github.com/radixdlt/official-examples.git To view this example move into its directory cd official-examples/step-by-step/04-gumball-machine then open in your chosen IDE or code editor. e.g. for VSCode run code . Resources and Data struct GumballMachine { gumballs: Vault, collected_xrd: Vault, price: Decimal } Our gumball machine will hold two kinds of resources in vaults: the gumballs to be dispensed, and any XRD which has been collected as payment. We'll also need to maintain the price, which we're using Decimal for. Decimal is a bounded type appropriate for use for resource quantities. In Scrypto, it has a fixed precision of 10^-18, and a maximum value of 2^96. Unless we're selling spectacularly expensive gumballs, this should be fine.
If we wanted an unbounded type to use for quantity, we could use BigDecimal instead. Getting Ready for Instantiation In order to instantiate a new gumball machine, the only input we need from the caller is to set the price of each gumball. After creation, we'll be returning the address of our new component, so we'll set our function signature up appropriately: pub fn instantiate_gumball_machine(price: Decimal) -> ComponentAddress { Within the instantiate_gumball_machine function, the first thing we need to do is create a new supply of gumballs which we intend to populate our new component with: let bucket_of_gumballs: Bucket = ResourceBuilder::new_fungible(OwnerRole::None) .divisibility(DIVISIBILITY_NONE) .metadata(metadata!( init { "name" => "Gumball", locked; "symbol" => "GUM", locked; "description" => "A delicious gumball", locked; } )) .mint_initial_supply(100) .into(); All that's left is to populate our GumballMachine struct with our supply of gumballs, the user-specified price, and an empty Vault which we will force to contain XRD. Then we'll instantiate it, which returns the address, and we'll return that to the caller. Self { gumballs: Vault::with_bucket(bucket_of_gumballs), collected_xrd: Vault::new(XRD), price: price, } .instantiate() .prepare_to_globalize(OwnerRole::None) .globalize() Allowing Callers to Buy Gumballs In order to sell a gumball, we just need the method caller to pass us in enough XRD to cover the price. We'll return the purchased gumball, as well as giving back their change if they overpaid, so we actually need to return two buckets. This is easily accomplished by simply returning a tuple, giving us a method signature like this: pub fn buy_gumball(&mut self, mut payment: Bucket) -> (Bucket, Bucket) { Note that we used &mut self because our reference to ourself must be mutable; we will be changing the contents of our vaults, if all goes well. 
Accomplishing the actual mechanics of putting the XRD in our vault, taking a gumball out, and then returning the gumball as well as whatever is left in the caller's input bucket is trivial: let our_share = payment.take(self.price); self.collected_xrd.put(our_share); (self.gumballs.take(1), payment) Note that we didn't have to check that the input bucket contained XRD...when we attempt to put tokens into our collected_xrd Vault (which was initialized to contain XRD), we'll get a runtime error if we try to put in anything else and the transaction would not take place. Unstored Tokens If tokens do not end up in a vault, you would need to confirm they are the correct resource by comparing the resource address with your desired address, e.g. assert_eq!(payment.resource_address(), XRD); Transactions cannot complete with any remaining tokens not stored in vaults, so this is needed only when tokens are received and either transferred away again or burned in the same transaction. Similarly, we'll get a runtime error if we try to take out a quantity matching the price, and find that there is insufficient quantity present. Finally, if the user provided exactly the correct amount of XRD as input, when we return their payment bucket, it will simply contain quantity 0. Checking the Price and Amount of Gumballs To check the price and availability of gumballs we have a method that returns both in a Status struct: pub fn get_status(&self) -> Status { Status { price: self.price, amount: self.gumballs.amount(), } } Status is a custom type that we've defined outside of our blueprint, above it in the same file in this case. So that it can work inside the blueprint we give it the #[derive(ScryptoSbor)] attribute. #[derive(ScryptoSbor)] pub struct Status { pub price: Decimal, pub amount: Decimal, } The #[derive(ScryptoSbor)] attribute allows data to be encoded into and decoded from Scrypto SBOR, the serialization data format required by the Radix Engine.
This makes it required for all custom types. Using the Gumball Machine To try using the Gumball machine in the Radix Engine Simulator follow the instructions in the official-examples repo (https://github.com/radixdlt/official-examples/blob/main/step-by-step/04-gumball-machine/README.md#using-the-gumball-machine) . resim Makes Things Easy To make things easy resim hides some steps from us, steps in the form of transaction manifests. A transaction manifest is a list of instructions that must be submitted to the network for the transaction to take place. resim automatically generates and submits these manifest files for us. We will revisit this in later examples, but if you'd like to see these hidden manifests you can add the --manifest flag to the resim command. Try it out with: resim call-method buy_gumball --manifest manifest.rtm This will output the manifest file to manifest.rtm in the current directory, where you can inspect it. ## Create Your First Custom Resource URL: https://radix.wiki/developers/legacy-docs/build/learning-step-by-step/learning-to-create-your-first-custom-resource Updated: 2026-02-18 Summary: Using Scrypto you can create a wide range of resources, whether fungible, non-fungible, simple tokens or more complex authorization badges. Being able to modify Build > Learning Step-by-Step > Create Your First Custom Resource — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-create-your-first-custom-resource.md) Using Scrypto you can create a wide range of resources, whether fungible, non-fungible, simple tokens or more complex authorization badges. Being able to modify their properties allows you to accurately represent your chosen entity/idea for each.
In Run Your First Project (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-run-your-first-scrypto-project.md) you've created your first resource. Before we start modifying resources let's understand what they are a little more. As an Asset-Oriented (https://github.com/gguuttss/radix-docs/blob/master/essentials/asset-oriented.md) language, resources/assets are not implemented on top of Scrypto, but are native to it. Many other platforms do not have resources at the core of their engines. Assets and their behaviours must be implemented through additional code that requires checking and tracking that transactions have happened as expected. With Scrypto the behaviours are guaranteed. To make resources behave intuitively and safely, we've also made the language and Radix Engine treat them like real world objects, making things like double spending (where a resource is counted twice but has only been transferred once) impossible, and ensuring no resources can be lost in a transaction. In other words, every transaction concludes with all resources securely transferred from and to an account or other component. The engine itself does the work of tracking and checking the deeper level of the transaction, so your code can focus on utility and application. Resources on Radix can have many different roles, so we have given them many different properties and behaviours that can be customized. Although much of the depth of these features lies in resource behaviors (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/resources/resource-behaviors.md) , resource metadata (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/metadata/README.md) is where a token's name and its symbol are stored.
For a deeper dive into resources, explore the Resources (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/resources/README.md) section of these docs, but for now let's move our focus to how we can create and customise our own. Creating a Resource If you've been following the Learning Step-By-Step (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/README.md) , you've already created your first resource, the Hello token. With a few adjustments to the blueprint code we can customize that token and make it more like the resources you will soon use in your own projects. In fact, you can create multiple resource types from the same updated code. Let's first take a look at how the Hello token is given its name and symbol. To create your own Hello blueprint run the command scrypto new-package hello in your terminal. Then look in the generated package, hello > src > lib.rs to view and edit the blueprint code. Metadata In many cases the resources you make will need to have information on them that makes them human understandable. Things like the name and description of a resource will need to be somehow associated with it. That information is held in metadata (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/metadata/README.md) . There are a variety of standard fields (https://github.com/gguuttss/radix-docs/blob/master/reference/standards/metadata-standards/metadata-for-wallet-display.md) that we can add (or any custom fields we need). A resource's name and symbol are amongst the fields used by wallets and dApps to display the resource to users, so are the first metadata fields we'll set. You can see them being set in the Hello blueprint in the instantiate_hello function here: .metadata(metadata! { init { "name" => "HelloToken", locked; "symbol" => "HT", locked; } }) With different values here, we can create a new token with a different name and symbol.
Metadata fields that have url values must be of type Url and not String, as they are treated differently by the Radix Engine. To do this convert the String to a Url with Url::of(), e.g. "icon_url" => Url::of("https://example.url/icon.png"), locked; Customizing Your Resource We can update the Hello blueprint to create resources with any name and symbol metadata that we pass in. We start by adding input arguments for the two fields to the instantiate_hello function. pub fn instantiate_hello(name: String, symbol: String) -> Global<Hello> { // --snip-- } Then adjust the resource builder to use the new arguments. let my_bucket: Bucket = ResourceBuilder::new_fungible(OwnerRole::None) .divisibility(DIVISIBILITY_MAXIMUM) .metadata(metadata! { init { "name" => name, locked; "symbol" => symbol, locked; } }) .mint_initial_supply(1000) .into(); Now when you create a component from this blueprint you input your desired name and symbol as arguments. These will be used in the instantiate (create) function to build a new resource with that name and symbol. The new resource will be stored in the component's vault, ready to be transferred out with the free_token method. To give it a go follow the instructions in the official-examples repo here (https://github.com/radixdlt/official-examples/tree/main/step-by-step/03-create-a-custom-resource#using-the-component) . One Blueprint, Multiple Resources A fundamental part of how we made the Radix Engine work is to reuse code (https://github.com/gguuttss/radix-docs/blob/master/essentials/reusable-code.md) . This means components are instantiated from the blueprints you write, and these components are the live objects that can be interacted with on the network, similar to smart contracts on other platforms. From a blueprint you construct a component (or more than one component).
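The pattern of one parameterised constructor producing many differently branded supplies can be sketched in ordinary Rust, without any Scrypto dependency (the `TokenSupply` type and its fields are hypothetical names chosen for this sketch, not Radix API):

```rust
// Hypothetical plain-Rust sketch of "one blueprint, many resources":
// the same constructor code runs each time; only the metadata differs.
struct TokenSupply {
    name: String,
    symbol: String,
    amount: u64,
}

impl TokenSupply {
    // Mirrors instantiate_hello(name, symbol) in spirit.
    fn instantiate(name: &str, symbol: &str) -> TokenSupply {
        TokenSupply {
            name: name.to_string(),
            symbol: symbol.to_string(),
            amount: 1000, // fixed initial supply, as in the Hello blueprint
        }
    }
}

fn main() {
    let hello = TokenSupply::instantiate("HelloToken", "HT");
    let world = TokenSupply::instantiate("WorldToken", "WT");
    assert_eq!(hello.symbol, "HT");
    assert_eq!(world.symbol, "WT");
    assert_eq!(hello.amount, world.amount); // same blueprint logic for both
    assert_ne!(hello.name, world.name);     // different metadata per call
}
```

On Radix the same reuse happens on ledger: each instantiation call produces a distinct component (and here a distinct resource) from a single deployed blueprint.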
This blueprints and components (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/blueprints-and-components.md) relationship gives us a few useful things we can do. - You can create multiple components from the same blueprint - You can use existing blueprints made by others to create your own components - You can make your own blueprints with the intention of sharing them with the community (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/scrypto-design-patterns/reusable-blueprints-pattern.md) With the updated Hello blueprint we can explore the first of these and create multiple components that each supply a different token when we call the free_token method. There are more steps to try this out in the official-examples repo (https://github.com/radixdlt/official-examples/tree/main/step-by-step/03-create-a-custom-resource#multiple-components-from-one-blueprint) . Closing Thoughts You've now taken your first step into Scrypto resource creation and customization. Customizing resource metadata has given us tokens with new names and symbols, making them accessible to users in wallets and dApps. We're also able to see some of the power of Scrypto with reusable blueprints. They've made us more efficient, allowing a single, versatile blueprint to create many differently named tokens. We develop these steps of customisation further in the next section introducing a Radix favourite, the Gumball Machine (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-build-a-gumball-machine.md) .
## Explain Your First Scrypto Project URL: https://radix.wiki/developers/legacy-docs/build/learning-step-by-step/learning-to-explain-your-first-scrypto-project Updated: 2026-02-18 Summary: After using the Hello package in Run Your First Scrypto Project(/learning-to-run-your-first-scrypto-project) I'm sure you'll want to better understand what you Build > Learning Step-by-Step > Explain Your First Scrypto Project — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-explain-your-first-scrypto-project.md) After using the Hello package in Run Your First Scrypto Project (/learning-to-run-your-first-scrypto-project) I'm sure you'll want to better understand what you just did. We give you a full explanation below, where you'll get more of a taste of how asset-oriented programming with Scrypto for DeFi works. You can find the Hello package code discussed here on GitHub (https://github.com/radixdlt/official-examples/tree/main/step-by-step/02-hello-token-explained) , or by simply creating a new package with scrypto new-package . The Hello package is a simple one-blueprint package. The Scrypto component it creates gives out a Hello Token whenever its free_token method is called. File Structure For every new Scrypto package, there are three main files/folders: - The tests folder, which contains all the test code; - The src folder, which contains all the source code; - The Cargo.toml configuration file. Cargo is the default package manager for Scrypto. It downloads all the dependencies, compiles source code, and makes a binary executable. This file specifies the dependencies and compile configuration. Blueprint In the src folder, there is a lib.rs file, which contains our blueprint code. Components, Blueprints and Packages A blueprint is the code that defines a single working part of our application. When it is instantiated it becomes an interactive component running in the Radix Engine.
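This blueprint/component split is the familiar class/instance split from object-oriented programming. A rough plain-Rust sketch (the names below are invented for the analogy; this is not how the Radix Engine actually hosts components): the `impl` block plays the role of a blueprint, and each value the role of a component with its own independent state. It also shows Rust's implicit-return rule, which Scrypto inherits: the final expression of a function, written without a trailing semicolon, is its return value.

```rust
// Plain-Rust analogy, not real Scrypto: `Hello` stands in for a blueprint
// (a definition), and each value of it for a component (a live instance).
struct Hello {
    balance: u64, // stands in for the component's vault contents
}

impl Hello {
    // Analogous to the instantiate_hello function called on the blueprint.
    fn instantiate_hello() -> Hello {
        Hello { balance: 1000 }
    }

    // Analogous to the free_token method called on a component.
    // The final expression (no `;`) is the return value.
    fn free_token(&mut self) -> u64 {
        self.balance -= 1;
        1
    }
}

fn main() {
    // Two "components" from one "blueprint", each with independent state.
    let mut a = Hello::instantiate_hello();
    let b = Hello::instantiate_hello();
    let token = a.free_token();
    assert_eq!(token, 1);
    assert_eq!(a.balance, 999);
    assert_eq!(b.balance, 1000); // `b` is unaffected by calls on `a`
}
```

On ledger, each instantiated component additionally gets its own address, and its state lives in the Radix Engine rather than in program memory.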
One or more blueprints grouped together, ready to be instantiated, are a package. We only have one blueprint in the package called Hello, which defines: - The state structure of all Hello components; a single vault (learning-to-explain-your-first-scrypto-project.md#buckets-vaults) . A vault is a container for resources (learning-to-explain-your-first-scrypto-project.md#resource-creation) ; - A function instantiate_hello, which instantiates a Hello component; - A method free_token, which returns a bucket of HelloToken (from the component vault) when called. use scrypto::prelude::*; #[blueprint] mod hello { // 1. The state structure of all `Hello` components struct Hello { sample_vault: Vault, } impl Hello { // 2. A function which instantiates a `Hello` component pub fn instantiate_hello() -> Global<Hello> { // --snip-- } // 3. A method which returns a bucket of `HelloToken` when invoked pub fn free_token(&mut self) -> Bucket { // --snip-- } } } 1. Defining Component Structure struct Hello { // A vault to store resources sample_vault: Vault, } Every blueprint must start with a struct defining what is stored where in the component. The struct has the same name as the blueprint. 2. Instantiating a Component from a Package pub fn instantiate_hello() -> Global<Hello> { // --snip-- } Blueprints need to have instantiate functions so they can be used to create components. These are usually named starting with instantiate_ or new_. In our case the function is instantiate_hello(). Resource Creation When a Hello component is instantiated, so is an initial supply of HelloToken resources. Resources In Scrypto, assets like tokens and non-fungibles are not implemented as blueprints or components. Instead, they are types of resources that are configured and requested directly from the system.
To create a new resource, we: - Use the ResourceBuilder to create a new fungible resource; - Specify the number of decimal places the resource can be divided into; - Specify the resource metadata, like name and symbol; - Specify the initial supply of the resource; - Convert the bucket type returned from minting the initial supply (here: FungibleBucket) to a more general Bucket. // 1. Define a new fungible resource with ResourceBuilder let my_bucket: Bucket = ResourceBuilder::new_fungible(OwnerRole::None) // 2. Set the max number of decimal places to 18 .divisibility(DIVISIBILITY_MAXIMUM) // 3. Set the metadata .metadata(metadata!{ init { "name" => "Hello Token", locked; "symbol" => "HT", locked; } }) // 4. Create the initial supply .mint_initial_supply(1000) // 5. Convert the sub bucket type to Bucket. .into(); Buckets & Vaults Buckets A bucket is a temporary container for resources. When a resource is created, it is in a bucket. As buckets only exist to move resources around we have to: - Put the new resources in a vault. // 5. Put the new resources in a vault sample_vault: Vault::with_bucket(my_bucket), Vaults A vault is a permanent container for resources, and where resources must be stored. Instantiation Finally, we can instantiate a Hello component by: - Calling instantiate - Making the component available in the network by calling globalize Self { sample_vault: Vault::with_bucket(my_bucket), } // 6. Instantiate the component .instantiate() // 7. Make the component available in the network .prepare_to_globalize(OwnerRole::None) .globalize() (The OwnerRole is explained in Give the Gumball Machine an Owner (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-give-the-gumball-machine-an-owner.md) ) This completes the instantiate_hello function, which creates a new HelloToken definition with an initial supply of 1000, stores the 1000 tokens inside a state struct and instantiates a new component from that state. 3.
Component Methods pub fn free_token(&mut self) -> Bucket { // --snip-- } Methods can only be called on instantiated components, not blueprints. Our Hello component has one method, free_token, which first logs the component's token balance then returns a HelloToken. Logs are explained more in the Logging section of these docs (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/logging.md) . free_token uses the info! macro for logging: info!( "My balance is: {} HelloToken. Now giving away a token!", self.sample_vault.amount() ); The method then returns a bucket of one token, ready to transfer to another component or account: // Return 1 HelloToken, taken from the vault self.sample_vault.take(1) The lack of ; at the end of the line means that the result of the last expression is returned from the method. This also applies to the instantiate_hello function, where the component is returned. Wrapping Up That's it! You now know how the Hello package works. The information here is the foundation for the rest of this learning journey (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/README.md) . The next step is Create Your First Custom Resource (/learning-to-create-your-first-custom-resource) , where we look in more detail at metadata, and why and how to change it. ## Run Your First Scrypto Project URL: https://radix.wiki/developers/legacy-docs/build/learning-step-by-step/learning-to-run-your-first-scrypto-project Updated: 2026-02-18 Summary: In this learning journey section, we'll walk you through running and interacting with Scrypto within the Radix Engine Simulator.
By the end, you'll have locally Build > Learning Step-by-Step > Run Your First Scrypto Project — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-run-your-first-scrypto-project.md) "Hello, World!" examples usually provide the simplest possible piece of code to understand the basics of a new language. However, Scrypto isn't just a typical language – it is specialized for the management of assets on a decentralized network. So rather than just printing Hello, World! to a console, we'll hand out a HelloToken! In this learning journey section, we'll walk you through running and interacting with Scrypto within the Radix Engine Simulator. By the end, you'll have locally deployed and used your first Scrypto "smart contract". This section focuses on getting existing code working on your machine. In the Explain Your First Scrypto Project (/learning-to-explain-your-first-scrypto-project) article, we will delve deeper into what the code is doing. Before we begin Make sure you have the Scrypto toolchain installed (https://github.com/gguuttss/radix-docs/blob/master/build/setting-up-for-scrypto-development/getting-rust-scrypto.md) . If you do, let's get started! Developing on Radix A simple software development workflow might involve writing code, conducting local testing, deploying to a remote test environment for further testing, and finally deploying to the production environment. When it comes to creating smart contracts on Radix, we follow a similar process. We use Scrypto to write the code and the Radix Engine Simulator (resim) for local testing. Once you're satisfied with your code, you can deploy it to our test network (Stokenet) and then to the main Radix Network (Mainnet). Our development steps are: - Create a package. - Define code with the business logic. - Deploy the code package, using resim for local testing, Stokenet for remote testing, or Mainnet for production. - Instantiate a component from the package.
Let's start these steps. Creating a New Blueprint Package Development starts by creating a package. We'll use the scrypto command line tool for this. - Open a new terminal in the directory where you'd like to create your new project and run the command below. scrypto new-package tutorial This scaffolds a simple package containing a Hello blueprint. Blueprints and Packages Radix splits the concept of "smart contracts" into two parts: - A blueprint is the code that defines the behavior of the application and what state it will manage. Blueprints are instantiated into components, which then get their own address and own state. In object-oriented programming terms, you can think of it like a blueprint being a class definition and a component being an object which is an instance of that class. - Blueprints are always part of a package when deployed. Multiple blueprints may be grouped together in a single package. - Open the new package directory in an IDE/code editor with Rust support. Choosing a code editor VSCode is where we've focused most of our efforts on tooling so far. Find the instructions on how to set up VSCode here (https://github.com/gguuttss/radix-docs/blob/master/build/setting-up-for-scrypto-development/choosing-an-ide.md) . - Then open up src/lib.rs to see the source that we’re going to compile and deploy to the simulator. code tutorial/src/lib.rs The Hello blueprint has one function which creates a new component with a supply of tokens (instantiate_hello), and one method to get one of those tokens from the component (free_token). A detailed explanation of this blueprint is in the next learning journey section (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-explain-your-first-scrypto-project.md) . So far we've completed the first two steps: - Create a package, and - Define blueprints with the business logic. Next we deploy the package in the Radix Engine Simulator.
Deploying the package locally via Radix Engine Simulator resim On distributed ledgers like Radix, each action on the network incurs transaction fees. Transaction fees (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/costing-and-limits/transaction-costing.md) reflect the load each transaction puts on the network and are used to compensate the network validators (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/validator.md) . To deploy our package, we will therefore need an account with enough XRD to cover the transaction fees. We can create both in resim. Creating an Account resim has simple commands to create and use accounts in the simulator environment. - Most resim commands use an existing account, so before we do anything else, let's create one. resim new-account You should get a success message. At the bottom of the output you should see the created account component address, public/private key, and the NonFungibleGlobalId of your owner badge. e.g.: A new account has been created! Account component address: account_sim1c956qr3kxlgypxwst89j9yf24tjc7zxd4up38x37zr6q4jxdx9rhma Public key: 03b9813c2244f864d3f886a8d8716aff5d234fab83a1c6f39d60999ddb01a15f98 Private key: 2ce29b819e212e999c8bd3ab05f529b7868a4667dfe6db03b257ab0560525635 Owner badge: resource_sim1nfzf2h73frult99zd060vfcml5kncq3mxpthusm9lkglvhsr0guahy:#1# Set up as default account since you had none, you can change it using `set-default-account`. Warning Do not attempt to use this private key on the main network! It is seeded from a predictable value and stored in plaintext. - If you do not see the line about setting your default account, then you already created an account previously.
Either reset your entire simulator with the resim reset command and try again, or set this new account as your default account with the following command: resim set-default-account <account_address> <private_key> <owner_badge> Displaying the Account - You can look at the resources in your new account by running: resim show resim show The show command can be used with any account component, component or resource address. You will get something like this: Component Address: account_sim1c956qr3kxlgypxwst89j9yf24tjc7zxd4up38x37zr6q4jxdx9rhma Blueprint ID: { package_address: package_sim1pkgxxxxxxxxxaccntxxxxxxxxxx000929625493xxxxxxxxxrn8jm6, blueprint_name: "Account" } Owned Fungible Resources: 1 └─ resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3: 10000 Radix (XRD) Owned Non-fungibles Resources: 1 └─ resource_sim1nfzf2h73frult99zd060vfcml5kncq3mxpthusm9lkglvhsr0guahy: 1 Owner Badge └─ #1# Metadata: 0 Note the Owned Fungible Resources: section, which indicates that the account already holds 10,000 XRD, which will be more than enough to deploy our first package. Deploying the Package - To publish/deploy our package, we need to make sure our terminal has navigated to the project root directory containing the Cargo.toml file: cd tutorial - Then we use the Radix Engine Simulator to publish/deploy our package locally: resim publish . Once this finishes you should see the published package’s address at the end of the output. e.g. Success! New Package: package_sim1pk3cmat8st4ja2ms8mjqy2e9ptk8y6cx40v4qnfrkgnxcp2krkpr92 - Store the address for the last of our 4 development steps, instantiating a component from our package. Component Instantiation - Now that our package is deployed "on ledger" in the simulator, we're going to create a component by calling the instantiate_hello function on the Hello blueprint.
resim call-function <package_address> Hello instantiate_hello You can find an example and explanation of the function call output below. The instantiate_hello output should look something like this:

Transaction Status: COMMITTED SUCCESS
Transaction Cost: 0.65177966419 XRD
├─ Network execution: 0.30327475 XRD, 6065495 execution cost units
├─ Network finalization: 0.1260127 XRD, 2520254 finalization cost units
├─ Tip: 0 XRD
├─ Network Storage: 0.22249221419 XRD
└─ Royalties: 0 XRD
Logs: 0
Events: 7
├─ Emitter: Method { node: internal_vault_sim1tz9uaalv8g3ahmwep2trlyj2m3zn7rstm9pwessa3k56me2fcduq2u, module_id: Main } Event: LockFeeEvent { amount: Decimal("5000"), }
├─ Emitter: Method { node: resource_sim1t4h3kupr5l95w6ufpuysl0afun0gfzzw7ltmk7y68ks5ekqh4cpx9w, module_id: Main } Event: MintFungibleResourceEvent { amount: Decimal("1000"), }
├─ Emitter: Method { node: resource_sim1t4h3kupr5l95w6ufpuysl0afun0gfzzw7ltmk7y68ks5ekqh4cpx9w, module_id: Main } Event: VaultCreationEvent { vault_id: NodeId(hex("5855080b052e56996aeedf0045f0ab8b2d6081e2bec248684ba4b5c2b03e")), }
├─ Emitter: Method { node: internal_vault_sim1tp2sszc99etfj6hwmuqytu9t3vkkpq0zhmpys6zt5j6u9vp7u6438n, module_id: Main } Event: DepositEvent { amount: Decimal("1000"), }
├─ Emitter: Method { node: internal_vault_sim1tz9uaalv8g3ahmwep2trlyj2m3zn7rstm9pwessa3k56me2fcduq2u, module_id: Main } Event: PayFeeEvent { amount: Decimal("0.65177966419"), }
├─ Emitter: Method { node: internal_vault_sim1tpsesv77qvw782kknjks9g3x2msg8cc8ldshk28pkf6m6lkhun3sel, module_id: Main } Event: DepositEvent { amount: Decimal("0.325889832095"), }
└─ Emitter: Method { node: resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3, module_id: Main } Event: BurnFungibleResourceEvent { amount: Decimal("0.325889832095"), }
Outputs: 3
├─ Unit
├─ Reference("component_sim1crkp7q8sfhg7xa0xvqtdjezltj3hams2hrk4ztzqs2c90sy0cslv6a")
└─ Enum::[0]
Balance Changes: 3
├─ Vault: internal_vault_sim1tz9uaalv8g3ahmwep2trlyj2m3zn7rstm9pwessa3k56me2fcduq2u ResAddr: resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3 Change: -0.65177966419
├─ Vault: internal_vault_sim1tp2sszc99etfj6hwmuqytu9t3vkkpq0zhmpys6zt5j6u9vp7u6438n ResAddr: resource_sim1t4h3kupr5l95w6ufpuysl0afun0gfzzw7ltmk7y68ks5ekqh4cpx9w Change: 1000
└─ Vault: internal_vault_sim1tpsesv77qvw782kknjks9g3x2msg8cc8ldshk28pkf6m6lkhun3sel ResAddr: resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3 Change: 0.325889832095
New Entities: 2
└─ Component: component_sim1crkp7q8sfhg7xa0xvqtdjezltj3hams2hrk4ztzqs2c90sy0cslv6a
└─ Resource: resource_sim1t4h3kupr5l95w6ufpuysl0afun0gfzzw7ltmk7y68ks5ekqh4cpx9w

This reads as: Caption Description Transaction Status Transaction was successful. Transaction Cost This section describes the fees you had to pay to execute this transaction. Read more about fees here (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/costing-and-limits/README.md) . Logs There were no logs. Events The transaction first executed a call to lock the fees. Then it executed one CallFunction instruction, that called the instantiate_hello function which in turn issued a CallMethod instruction. The events emitted show us what happened as a result of the manifest instructions. Outputs and Balance Changes The newly instantiated ComponentAddress and information about resource movement are displayed here. New Entities There were two new entities created: the Hello component and the HelloToken resource. This creates two new entities with different addresses: - a new HelloToken with a ResourceAddress, - and your fresh Hello component with a ComponentAddress. Store the component address for later use. We have now completed all the development steps. - Create a package. - Define code with the business logic. - Deploy the code package to resim. - Instantiate a component from the package.
Using the Hello Component in resim Now that we have our component in our local environment, we can interact with it. - Let's call the free_token method. resim call-method <component_address> free_token This uses resim call-method rather than the resim call-function we just used, since we are now calling a method on an instantiated component instead of a function on a blueprint. You can find an example and explanation of the method call output below. The free_token output should look something like this:

Transaction Status: COMMITTED SUCCESS
Transaction Cost: 0.43160387771 XRD
├─ Network execution: 0.30021485 XRD, 6004297 execution cost units
├─ Network finalization: 0.0363077 XRD, 726154 finalization cost units
├─ Tip: 0 XRD
├─ Network Storage: 0.09508132771 XRD
└─ Royalties: 0 XRD
Logs: 1
└─ [INFO ] My balance is: 1000 HelloToken. Now giving away a token!
Events: 8
├─ Emitter: Method { node: internal_vault_sim1tz9uaalv8g3ahmwep2trlyj2m3zn7rstm9pwessa3k56me2fcduq2u, module_id: Main } Event: LockFeeEvent { amount: Decimal("5000"), }
├─ Emitter: Method { node: internal_vault_sim1tp2sszc99etfj6hwmuqytu9t3vkkpq0zhmpys6zt5j6u9vp7u6438n, module_id: Main } Event: WithdrawEvent { amount: Decimal("1"), }
├─ Emitter: Method { node: resource_sim1t4h3kupr5l95w6ufpuysl0afun0gfzzw7ltmk7y68ks5ekqh4cpx9w, module_id: Main } Event: VaultCreationEvent { vault_id: NodeId(hex("5825b3cabab992e4d46fda2ff9dad46a8b2021785ca645d0e2986f0824a2")), }
├─ Emitter: Method { node: internal_vault_sim1tqjm8j46hxfwf4r0mghlnkk5d29jqgtctjnyt58znphssf9zm6gyg5, module_id: Main } Event: DepositEvent { amount: Decimal("1"), }
├─ Emitter: Method { node: account_sim1c956qr3kxlgypxwst89j9yf24tjc7zxd4up38x37zr6q4jxdx9rhma, module_id: Main } Event: DepositEvent::Fungible( ResourceAddress(Reference("resource_sim1t4h3kupr5l95w6ufpuysl0afun0gfzzw7ltmk7y68ks5ekqh4cpx9w"), Decimal("1"), )
├─ Emitter: Method { node: internal_vault_sim1tz9uaalv8g3ahmwep2trlyj2m3zn7rstm9pwessa3k56me2fcduq2u, module_id: Main } Event: PayFeeEvent { amount: Decimal("0.43160387771"), }
├─ Emitter: Method { node: internal_vault_sim1tpsesv77qvw782kknjks9g3x2msg8cc8ldshk28pkf6m6lkhun3sel, module_id: Main } Event: DepositEvent { amount: Decimal("0.215801938855"), }
└─ Emitter: Method { node: resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3, module_id: Main } Event: BurnFungibleResourceEvent { amount: Decimal("0.215801938855"), }
Outputs: 3
├─ Unit
├─ Own("internal_component_sim1lruex79e7m0tgvp852a0298xlmk5acltxnu8rp785yufcd26vcf5hg")
└─ Enum::[0]
Balance Changes: 4
├─ Vault: internal_vault_sim1tz9uaalv8g3ahmwep2trlyj2m3zn7rstm9pwessa3k56me2fcduq2u ResAddr: resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3 Change: -0.43160387771
├─ Vault: internal_vault_sim1tp2sszc99etfj6hwmuqytu9t3vkkpq0zhmpys6zt5j6u9vp7u6438n ResAddr: resource_sim1t4h3kupr5l95w6ufpuysl0afun0gfzzw7ltmk7y68ks5ekqh4cpx9w Change: -1
├─ Vault: internal_vault_sim1tqjm8j46hxfwf4r0mghlnkk5d29jqgtctjnyt58znphssf9zm6gyg5 ResAddr: resource_sim1t4h3kupr5l95w6ufpuysl0afun0gfzzw7ltmk7y68ks5ekqh4cpx9w Change: 1
└─ Vault: internal_vault_sim1tpsesv77qvw782kknjks9g3x2msg8cc8ldshk28pkf6m6lkhun3sel ResAddr: resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3 Change: 0.215801938855
New Entities: 0

This reads as: Caption Description Transaction Status Transaction was successful. Transaction Fee and Cost Units This section describes the fees you had to pay to execute this transaction. Read more about fees here (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/costing-and-limits/README.md) . Logs There was one log message. The message was: "[INFO ] My balance is: 1000 HelloToken. Now giving away a token!" Events The transaction first locked the fees. Then it executed one CallMethod instruction, that called the free_token method which in turn issued another CallMethod instruction to withdraw and store the HelloToken in your account.
Outputs and Balance Changes Information about resource movement is displayed here. You'll now have a shiny new HelloToken in your account, and your Hello component has one less. - We can check this by using resim show on our account and Hello components. resim show <account_address> resim show <component_address> Final Notes - If you make changes to the structs within your code, then unfortunately you will have to run through the entire publish-instantiate-call flow from scratch, saving the new addresses as they appear. If you only make implementation changes then it is possible to update your package with: resim publish . --package-address <package_address> - At any point you can instantly get a clean slate in the simulator by running: resim reset You almost certainly need to do this if you switch to working on a different project. Well done. You've finished this introductory tutorial on using resim for Scrypto. You can continue your learning journey with our explanation of the package code in the next section, Explain Your First Scrypto Project (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/learning-to-explain-your-first-scrypto-project.md) .
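As a closing recap, the ledger-side effect of the two transactions you just ran can be modelled in plain Rust (the `Bucket` and `Vault` types below are simplified stand-ins invented for this sketch, not the real Scrypto API): instantiation mints an initial supply into a vault, and each free_token call moves one token out in a bucket, which must itself end up in a vault (your account's).

```rust
// Simplified stand-ins for Scrypto's containers, for illustration only.
struct Bucket { amount: u64 } // transient: exists only to move resources
struct Vault { amount: u64 }  // permanent: where resources must end up

impl Vault {
    // Like Vault::with_bucket: the bucket is consumed into the vault.
    fn with_bucket(bucket: Bucket) -> Vault {
        Vault { amount: bucket.amount }
    }
    // Like vault.take(n): resources leave only inside a new bucket.
    fn take(&mut self, n: u64) -> Bucket {
        assert!(self.amount >= n, "insufficient resources");
        self.amount -= n;
        Bucket { amount: n }
    }
}

fn main() {
    // instantiate_hello: mint 1000 tokens and store them in the vault.
    let minted = Bucket { amount: 1000 };
    let mut sample_vault = Vault::with_bucket(minted);

    // free_token: one token leaves the component vault in a bucket...
    let payout = sample_vault.take(1);

    // ...and is deposited into another vault, e.g. your account's.
    let account_vault = Vault::with_bucket(payout);
    assert_eq!(sample_vault.amount, 999);
    assert_eq!(account_vault.amount, 1);
}
```

The Balance Changes sections of the transaction outputs above report exactly these movements: minus one HelloToken from the component's vault, plus one into your account's.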
## Learning Step-by-Step URL: https://radix.wiki/developers/legacy-docs/build/learning-step-by-step/learning-step-by-step Updated: 2026-02-18 Summary: We've made two learning paths to teach you how to build on Radix, one for if you want to make full stack apps(#scrypto-and-front-end) (using Scrypto for on ledg Build > Learning Step-by-Step — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/README.md) We've made two learning paths to teach you how to build on Radix: one for those who want to make full stack apps (#scrypto-and-front-end) (using Scrypto for on-ledger components and JS for the front end), and another for those who want to build and connect a front end web app to existing Scrypto components (#front-end) . Scrypto If you’re only planning to build on ledger (in Scrypto) then follow the full stack path (#scrypto-and-front-end) . Sections covering front end interactions are marked as skippable. Scrypto and Front End This step-by-step guide will build you up from the very basics of Scrypto all the way through to a full application with a frontend. It is intended to be followed from start to end, but if you’re just looking to achieve specific tasks you can browse the steps by topic below.
The basics of Scrypto package structure and how to use it

| Step | Title | Topic |
| --- | --- | --- |
| 0 | Getting Rust & Scrypto (getting-rust-scrypto) | Installing the Scrypto toolchain |
| 1 | Run Your First Scrypto Project (learning-to-run-your-first-scrypto-project) | Running Scrypto code in the Radix Engine Simulator |
| 2 | Explain Your First Scrypto Project (learning-to-explain-your-first-scrypto-project) | Scrypto package structure and template code explanation |

An introduction to resource metadata and behaviours

| Step | Title | Topic |
| --- | --- | --- |
| 3 | Create Your First Custom Resource (learning-to-create-your-first-custom-resource) | Create resources with different metadata |
| 4 | Build a Gumball Machine (learning-to-build-a-gumball-machine) | Build a simple Gumball Machine blueprint |
| 5 | Give the Gumball Machine an Owner (learning-to-give-the-gumball-machine-an-owner) | Ownership and protected methods |
| 6 | Make the Gumball Machine Refillable (learning-to-make-your-gumball-machine-refillable) | The mintable resource behaviour |

Interact with your package on the test network

| Step | Title | Topic |
| --- | --- | --- |
| 7 | Create and Use Transaction Manifests (learning-to-create-and-use-transaction-manifests) | Use transaction manifests in the Radix Engine Simulator |
| 8 | Use the Gumball Machine on Stokenet (learning-to-use-the-gumball-machine-on-stokenet) | Deploy a package on the Stokenet test network |
| 9 | Run Your First Front End dApp (learning-to-run-your-first-front-end-dapp) | Deploy and run a simple front end dapp on Stokenet |
| 10 | Run the Gumball Machine Front End dApp (learning-to-run-the-gumball-machine-front-end-dapp) | Deploy and run the Gumball Machine as a front end dapp on Stokenet |
| 11 | Set Verification Metadata (learning-to-set-verification-metadata) | Two way linking metadata for verification |

Non-fungible resources, authorization and advanced resource behaviours

| Step | Title | Topic |
| --- | --- | --- |
| 12 | Create Your First Non-Fungible (learning-to-create-your-first-non-fungible) | Mint a non-fungible resource |
| 13 | Build a Candy Store (learning-to-build-a-candy-store) | Custom authorization roles |
| 14 | Make Recallable Badges (learning-to-make-recallable-badges) | The recallable and burnable resource behaviours |

Interaction between blueprints

| Step | Title | Topic |
| --- | --- | --- |
| 15 | Build a Multi-Blueprint Package (learning-to-build-a-multi-blueprint-package) | Modularise packages into smaller blueprints |
| 16 | Create Owned Components (learning-to-create-owned-components) | Compare owned and global components |
| 17 | Use External Blueprints (learning-to-use-external-blueprints) | Use blueprints that aren’t a part of your package |
| 18 | Use External Components (learning-to-use-external-components) | Use components that aren’t a part of your package |

Testing

| Step | Title | Topic |
| --- | --- | --- |
| 19 | Explain Your First Test (learning-to-explain-your-first-test) | Understand and run the tests on the Hello template blueprint |
| 20 | Test a Multi-Blueprint Package (learning-to-test-a-multi-blueprint-package) | Unit and integration tests |

DEX example

| Step | Title | Topic |
| --- | --- | --- |
| 21 | Run the Radiswap dApp (run-the-radiswap-dapp) | Deploy and run a simple two resource DEX |

Front End If you are only looking to build a front end for a preexisting Scrypto package you will gain the most from the following sections:

| Step | Title | Topic |
| --- | --- | --- |
| 9 | Run Your First Front End dApp (learning-to-run-your-first-front-end-dapp) | Deploy and run a simple front end dapp on Stokenet |
| 10 | Run the Gumball Machine Front End dApp (learning-to-run-the-gumball-machine-front-end-dapp) | Deploy and run the Gumball Machine as a front end dapp on Stokenet |
| 11 | Set Verification Metadata (learning-to-set-verification-metadata) | Two way linking metadata for verification |
| 21 | Run the Radiswap dApp - Front End (run-the-radiswap-dapp#the-radiswap-front-end) | Deploy and run a simple two resource DEX |

## Choosing a Code Editor/IDE URL: https://radix.wiki/developers/legacy-docs/build/setting-up-for-scrypto-development/choosing-an-ide Updated: 2026-02-18 Summary: Here you'll find steps for installing the Scrypto toolchain on Linux, macOS and Windows. Troubleshooting suggestions follow after the installation instructions.
Build > Setting up for Development > Choosing a Code Editor/IDE — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/setting-up-for-scrypto-development/choosing-an-ide.md) Here you'll find steps for installing the Scrypto toolchain on Linux, macOS and Windows. Troubleshooting suggestions follow after the installation instructions.

Compatibility

The table below shows the compatibility between Scrypto and other tools.

| Scrypto/Resim | LLVM/Clang | Rust |
| --- | --- | --- |
| v1.3.1 | 21 | 1.92.0 |
| v1.3.0 | 18 | 1.81.0 |
| v1.2.0 | 17 | 1.77.2 |
| v1.1.0 | 17 | 1.75.0 |
| v1.0.0 | 16 | 1.72.1 |

Note: Other versions of LLVM and/or Rust may also work, but have not been tested.

Install the Scrypto Toolchain

To begin working with Scrypto, you first need to prepare your system for Rust development. Then you can install the Scrypto libraries, Radix Engine Simulator (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/tools-for-scrypto/resim-radix-engine-simulator.md) (resim) and command line tools. You can do this by following the manual steps for Windows (getting-rust-scrypto.md#manual-windows-install) , macOS (getting-rust-scrypto.md#manual-macos-install) or Linux (getting-rust-scrypto.md#manual-linux-install) systems, or run our scripts for automated installation (getting-rust-scrypto.md#automated-installation) at the latest Scrypto version. Automated Installation Use these scripts to install the full Scrypto toolchain on your machine.
Windows Invoke-RestMethod 'https://raw.githubusercontent.com/radixdlt/radixdlt-scrypto/refs/heads/main/scrypto-install-scripts/install-scrypto-windows.ps1' | Invoke-Expression macOS curl -fsSL https://raw.githubusercontent.com/radixdlt/radixdlt-scrypto/refs/heads/main/scrypto-install-scripts/install-scrypto-macos.sh | zsh Linux (Debian based distributions) curl -fsSL https://raw.githubusercontent.com/radixdlt/radixdlt-scrypto/refs/heads/main/scrypto-install-scripts/install-scrypto-debian.sh | bash Next Steps After Installation Look at After Installing Scrypto (getting-rust-scrypto.md#after-installing-scrypto) for more helpful tools and suggestions Manual Windows Install 1. Install LLVM & Rust compiler - Install git by running the git installer for Windows (https://gitforwindows.org/) - Enable git long path support: git config --system core.longpaths true - Visit the Visual Studio Downloads page (https://visualstudio.microsoft.com/downloads/?q=build+tools#build-tools-for-visual-studio-2022) : - Download "Build Tools for Visual Studio 2022" - Once the download is complete, open the downloaded file to run the installer. - In the installer, you will see various workloads you can install. Look for "Desktop development with C++". - Tick the checkbox next to "Desktop development with C++". This will automatically select all the necessary components for C++ development. - After selecting the necessary workload, click the "Install" button at the bottom right of the installer window. - Once the installation is complete, you can close the installer. - To verify the installation, open a new Command Prompt or PowerShell window and type cl to check if the C++ compiler (cl.exe) is available.
- Download and install rustup-init.exe (https://win.rustup.rs/x86_64) - Download and install LLVM from here (https://github.com/llvm/llvm-project/releases/download/llvmorg-21.1.8/LLVM-21.1.8-win64.exe) or check the LLVM GitHub repository (https://github.com/llvm/llvm-project/releases) to find other versions - be sure to tick the option that adds LLVM to the system PATH - Install the required Rust toolchain rustup default 1.92.0 2. Enable cargo in the current shell - Start a new PowerShell 3. Add WebAssembly target rustup target add wasm32-unknown-unknown 4. Install Radix Engine Simulator and command-line tools cargo install --force radix-clis@1.3.1 5. Next Steps After Installation Look at After Installing Scrypto (getting-rust-scrypto.md#after-installing-scrypto) for more helpful tools and suggestions Manual macOS Install 1. Install LLVM & Rust compiler - Make sure you have the Xcode command line tools by running: xcode-select --install - Install cmake and LLVM with brew. If you don't already have Homebrew installed you will need to follow the instructions here (https://brew.sh/): brew install cmake llvm - Add LLVM to the system path by updating ~/.zshrc and ~/.profile: path_update='export PATH="$(brew --prefix llvm)/bin:$PATH"' echo $path_update >> ~/.zshrc && echo $path_update >> ~/.profile - Install Rust compiler # Replace 1.92.0 with the required version curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- --default-toolchain=1.92.0 2. Enable cargo in the current shell source $HOME/.cargo/env 3. Add WebAssembly target rustup target add wasm32-unknown-unknown 4. Install Radix Engine Simulator and command-line tools cargo install --force radix-clis@1.3.1 5. Next Steps After Installation Look at After Installing Scrypto (getting-rust-scrypto.md#after-installing-scrypto) for more helpful tools and suggestions Manual Linux Install 1.
Install LLVM & Rust compiler - Make sure a C++ compiler and LLVM are installed: sudo apt install clang build-essential llvm - Install Rust compiler # Replace 1.92.0 with the required version curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- --default-toolchain=1.92.0 2. Enable cargo in the current shell source $HOME/.cargo/env 3. Add WebAssembly target rustup target add wasm32-unknown-unknown 4. Install Radix Engine Simulator and command-line tools cargo install --force radix-clis@1.3.1 5. Next Steps After Installation Look at After Installing Scrypto (getting-rust-scrypto.md#after-installing-scrypto) for more helpful tools and suggestions Troubleshooting Windows - Check your clang version clang --version - Check that your LLVM/Clang version is compatible with your Rust and Resim versions by looking at the Compatibility Table (docs/getting-rust-scrypto#compatibility) - Install the appropriate LLVM version from the LLVM GitHub repository (https://github.com/llvm/llvm-project/releases) .
- Reinstall required Rust toolchain # Replace 1.92.0 with the required version rustup default 1.92.0 - Add WebAssembly target rustup target add wasm32-unknown-unknown - Install Radix Engine Simulator and command-line tools # Replace 1.3.1 with the required version cargo install --force radix-clis@1.3.1 macOS Try the following steps: - Define rust stable: # Replace 1.92.0 with the required version rustup default 1.92.0 - Install LLVM 21 using brew: # Replace 21 with the required version brew install llvm@21 - Confirm the Installation Path: # Replace 21 with the required version brew --prefix llvm@21 **Output:** `/usr/local/opt/llvm@21` - Add the versioned LLVM to the system path: # Replace @21 with the required version path_update='export PATH="$(brew --prefix llvm@21)/bin:$PATH"' echo $path_update >> ~/.zshrc && echo $path_update >> ~/.profile - Open a new terminal window to reload the environment - Check that the installation of LLVM was done correctly by checking the clang version clang --version - Install Xcode from the App Store - Reset the Xcode config: xcodebuild -runFirstLaunch - Add WebAssembly target rustup target add wasm32-unknown-unknown - Install Radix Engine Simulator and command-line tools cargo install --force radix-clis@1.3.1 macOS Command line tools - Remove command line tools sudo rm -rf /Library/Developer/CommandLineTools - Reinstall command line tools xcode-select --install After Installing Scrypto You have now successfully installed the Scrypto toolchain and can start writing your own Scrypto code. To do that, we advise you to use an IDE/code editor. Setting up Your IDE/Code Editor An IDE, which stands for Integrated Development Environment, is like a text editor, but with added features to help you while programming. For writing Scrypto code you will need to install an IDE with Rust support (because Scrypto is built on top of Rust). We recommend Visual Studio Code where there are rust and Radix Developer Tools extensions to assist.
Visual Studio Code To set up Visual Studio Code: - Start by installing VS Code by following the download and install instructions (https://code.visualstudio.com/) . - Install the rust-analyzer and Radix Developer Tools extensions. This will give you syntax highlighting and code suggestions while you write your Scrypto code and manifests: - Open VS Code - Click on the extension icon on the left panel - Search for "rust-analyzer" - Click on install - Search for "Radix Developer Tools" - Click on install You are now ready to start writing your first Scrypto blueprint. What Should You Do Next? If you're not sure what to do after installing Scrypto, have a look at our Learning Step-by-Step (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/README.md) to learn how to start making Scrypto packages and dapps. More Information - Scrypto CLI (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/tools-for-scrypto/scrypto-cli-tool.md) - Radix Engine Simulator (radix-engine-simulator-resim) ## Updating Scrypto URL: https://radix.wiki/developers/legacy-docs/build/setting-up-for-scrypto-development/updating-scrypto Updated: 2026-02-18 Summary: When a new version of Scrypto is released, it is recommended to update your toolchain to benefit from the latest changes. To update to the latest version, follow the steps below. Build > Setting up for Development > Updating Scrypto — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/setting-up-for-scrypto-development/updating-scrypto.md) Updating Scrypto to the latest version When a new version of Scrypto is released, it is recommended to update your toolchain to benefit from the latest changes.
To update to the latest version, follow these steps: - Open a terminal, or PowerShell if you are on Windows - Install the required Rust compiler (https://github.com/gguuttss/radix-docs/blob/master/build/setting-up-for-scrypto-development/getting-rust-scrypto.md) and set it as default: rustup default <version> - Reinstall the Radix CLIs with: cargo install --force radix-clis - Reset your simulator with: resim reset Updating existing projects When updating an existing project, to recompile with all the latest updates: - delete the existing Cargo.lock file - run cargo clean - run scrypto build ## Getting Rust & Scrypto URL: https://radix.wiki/developers/legacy-docs/build/setting-up-for-scrypto-development/getting-rust-scrypto Updated: 2026-02-18 Summary: Here you'll find steps for installing the Scrypto toolchain on Linux, macOS and Windows. Troubleshooting suggestions follow after the installation instructions. Build > Setting up for Development > Getting Rust & Scrypto — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/setting-up-for-scrypto-development/getting-rust-scrypto.md) Here you'll find steps for installing the Scrypto toolchain on Linux, macOS and Windows. Troubleshooting suggestions follow after the installation instructions.

Compatibility

The table below shows the compatibility between Scrypto and other tools.

| Scrypto/Resim | LLVM/Clang | Rust |
| --- | --- | --- |
| v1.3.1 | 21 | 1.92.0 |
| v1.3.0 | 18 | 1.81.0 |
| v1.2.0 | 17 | 1.77.2 |
| v1.1.0 | 17 | 1.75.0 |
| v1.0.0 | 16 | 1.72.1 |

Note: Other versions of LLVM and/or Rust may also work, but have not been tested.

Install the Scrypto Toolchain

To begin working with Scrypto, you first need to prepare your system for Rust development. Then you can install the Scrypto libraries, Radix Engine Simulator (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/tools-for-scrypto/resim-radix-engine-simulator.md) (resim) and command line tools.
You can do this by following the manual steps for Windows (getting-rust-scrypto.md#manual-windows-install) , macOS (getting-rust-scrypto.md#manual-macos-install) or Linux (getting-rust-scrypto.md#manual-linux-install) systems, or run our scripts for automated installation (getting-rust-scrypto.md#automated-installation) at the latest Scrypto version. Automated Installation Use these scripts to install the full Scrypto toolchain on your machine. Windows Invoke-RestMethod 'https://raw.githubusercontent.com/radixdlt/radixdlt-scrypto/refs/heads/main/scrypto-install-scripts/install-scrypto-windows.ps1' | Invoke-Expression macOS curl -fsSL https://raw.githubusercontent.com/radixdlt/radixdlt-scrypto/refs/heads/main/scrypto-install-scripts/install-scrypto-macos.sh | zsh Linux (Debian based distributions) curl -fsSL https://raw.githubusercontent.com/radixdlt/radixdlt-scrypto/refs/heads/main/scrypto-install-scripts/install-scrypto-debian.sh | bash Next Steps After Installation Look at After Installing Scrypto (getting-rust-scrypto.md#after-installing-scrypto) for more helpful tools and suggestions Manual Windows Install 1. Install LLVM & Rust compiler - Install git by running the git installer for Windows (https://gitforwindows.org/) - Enable git long path support: git config --system core.longpaths true - Visit the Visual Studio Downloads page (https://visualstudio.microsoft.com/downloads/?q=build+tools#build-tools-for-visual-studio-2022) : - Download "Build Tools for Visual Studio 2022" - Once the download is complete, open the downloaded file to run the installer. - In the installer, you will see various workloads you can install. Look for "Desktop development with C++". - Tick the checkbox next to "Desktop development with C++". This will automatically select all the necessary components for C++ development. - After selecting the necessary workload, click the "Install" button at the bottom right of the installer window.
- Once the installation is complete, you can close the installer. - To verify the installation, open a new Command Prompt or PowerShell window and type cl to check if the C++ compiler (cl.exe) is available. - Download and install rustup-init.exe (https://win.rustup.rs/x86_64) - Download and install LLVM from here (https://github.com/llvm/llvm-project/releases/download/llvmorg-21.1.8/LLVM-21.1.8-win64.exe) or check the LLVM GitHub repository (https://github.com/llvm/llvm-project/releases) to find other versions - be sure to tick the option that adds LLVM to the system PATH - Install the required Rust toolchain rustup default 1.92.0 2. Enable cargo in the current shell - Start a new PowerShell 3. Add WebAssembly target rustup target add wasm32-unknown-unknown 4. Install Radix Engine Simulator and command-line tools cargo install --force radix-clis@1.3.1 5. Next Steps After Installation Look at After Installing Scrypto (getting-rust-scrypto.md#after-installing-scrypto) for more helpful tools and suggestions Manual macOS Install 1. Install LLVM & Rust compiler - Make sure you have the Xcode command line tools by running: xcode-select --install - Install cmake and LLVM with brew. If you don't already have Homebrew installed you will need to follow the instructions here (https://brew.sh/): brew install cmake llvm - Add LLVM to the system path by updating ~/.zshrc and ~/.profile: path_update='export PATH="$(brew --prefix llvm)/bin:$PATH"' echo $path_update >> ~/.zshrc && echo $path_update >> ~/.profile - Install Rust compiler # Replace 1.92.0 with the required version curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- --default-toolchain=1.92.0 2. Enable cargo in the current shell source $HOME/.cargo/env 3. Add WebAssembly target rustup target add wasm32-unknown-unknown 4. Install Radix Engine Simulator and command-line tools cargo install --force radix-clis@1.3.1 5.
Next Steps After Installation Look at After Installing Scrypto (getting-rust-scrypto.md#after-installing-scrypto) for more helpful tools and suggestions Manual Linux Install 1. Install LLVM & Rust compiler - Make sure a C++ compiler and LLVM are installed: sudo apt install clang build-essential llvm - Install Rust compiler # Replace 1.92.0 with the required version curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- --default-toolchain=1.92.0 2. Enable cargo in the current shell source $HOME/.cargo/env 3. Add WebAssembly target rustup target add wasm32-unknown-unknown 4. Install Radix Engine Simulator and command-line tools cargo install --force radix-clis@1.3.1 5. Next Steps After Installation Look at After Installing Scrypto (getting-rust-scrypto.md#after-installing-scrypto) for more helpful tools and suggestions Troubleshooting Windows - Check your clang version clang --version - Check that your LLVM/Clang version is compatible with your Rust and Resim versions by looking at the Compatibility Table (docs/getting-rust-scrypto#compatibility) - Install the appropriate LLVM version from the LLVM GitHub repository (https://github.com/llvm/llvm-project/releases) .
- Reinstall required Rust toolchain # Replace 1.92.0 with the required version rustup default 1.92.0 - Add WebAssembly target rustup target add wasm32-unknown-unknown - Install Radix Engine Simulator and command-line tools # Replace 1.3.1 with the required version cargo install --force radix-clis@1.3.1 macOS Try the following steps: - Define rust stable: # Replace 1.92.0 with the required version rustup default 1.92.0 - Install LLVM 21 using brew: # Replace 21 with the required version brew install llvm@21 - Confirm the Installation Path: # Replace 21 with the required version brew --prefix llvm@21 **Output:** `/usr/local/opt/llvm@21` - Add the versioned LLVM to the system path: # Replace @21 with the required version path_update='export PATH="$(brew --prefix llvm@21)/bin:$PATH"' echo $path_update >> ~/.zshrc && echo $path_update >> ~/.profile - Open a new terminal window to reload the environment - Check that the installation of LLVM was done correctly by checking the clang version clang --version - Install Xcode from the App Store - Reset the Xcode config: xcodebuild -runFirstLaunch - Add WebAssembly target rustup target add wasm32-unknown-unknown - Install Radix Engine Simulator and command-line tools cargo install --force radix-clis@1.3.1 macOS Command line tools - Remove command line tools sudo rm -rf /Library/Developer/CommandLineTools - Reinstall command line tools xcode-select --install After Installing Scrypto You have now successfully installed the Scrypto toolchain and can start writing your own Scrypto code. To do that, we advise you to use an IDE/code editor. Setting up Your IDE/Code Editor An IDE, which stands for Integrated Development Environment, is like a text editor, but with added features to help you while programming. For writing Scrypto code you will need to install an IDE with Rust support (because Scrypto is built on top of Rust). We recommend Visual Studio Code where there are rust and Radix Developer Tools extensions to assist.
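Before moving on to editor setup, it can help to double-check that the tool versions you installed line up. The compatibility table at the top of this page can be encoded as a small lookup; a minimal Rust sketch (the function name and return shape are illustrative, not part of any Radix tooling — versions are copied from the table, and anything not listed is untested rather than unsupported):

```rust
// Map a Scrypto/resim release to its tested (Rust, LLVM major) versions,
// per the compatibility table on this page.
fn required_toolchain(scrypto: &str) -> Option<(&'static str, &'static str)> {
    match scrypto {
        "v1.3.1" => Some(("1.92.0", "21")),
        "v1.3.0" => Some(("1.81.0", "18")),
        "v1.2.0" => Some(("1.77.2", "17")),
        "v1.1.0" => Some(("1.75.0", "17")),
        "v1.0.0" => Some(("1.72.1", "16")),
        // Other combinations may work but are untested.
        _ => None,
    }
}

fn main() {
    if let Some((rust, llvm)) = required_toolchain("v1.3.1") {
        println!("rust={} llvm={}", rust, llvm);
    }
}
```

Compare the output against `rustc --version` and `clang --version` on your machine.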
Visual Studio Code To set up Visual Studio Code: - Start by installing VS Code by following the download and install instructions (https://code.visualstudio.com/) . - Install the rust-analyzer and Radix Developer Tools extensions. This will give you syntax highlighting and code suggestions while you write your Scrypto code and manifests: - Open VS Code - Click on the extension icon on the left panel - Search for "rust-analyzer" - Click on install - Search for "Radix Developer Tools" - Click on install You are now ready to start writing your first Scrypto blueprint. What Should You Do Next? If you're not sure what to do after installing Scrypto, have a look at our Learning Step-by-Step (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/README.md) to learn how to start making Scrypto packages and dapps. More Information - Scrypto CLI (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/tools-for-scrypto/scrypto-cli-tool.md) - Radix Engine Simulator (radix-engine-simulator-resim) ## Setting Up URL: https://radix.wiki/developers/legacy-docs/build/setting-up-for-scrypto-development/setting-up-for-dapp-development Updated: 2026-02-18 Summary: You can develop applications for Radix, both frontend and backend, on Mac, Linux, or Windows, using free tools. Build > Setting up for Development > Setting Up — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/setting-up-for-scrypto-development/setting-up-for-dapp-development.md) You can develop applications for Radix, both frontend and backend, on Mac, Linux, or Windows, using free tools. Please be aware that most of the example content is written for Mac/Linux environments, and Windows users will need to translate some concepts over to their environment, particularly shell script examples.
If you haven't yet read through the Essentials (https://github.com/gguuttss/radix-docs/blob/master/essentials/README.md) section, it's a good idea to run through that before returning here to get set up, and it should take you less than 5 minutes to complete. You can also sign up for the Developer Program (https://developers.radixdlt.com/devprogram) to receive communications about the latest development news, earn rewards, and gain priority access to dev events. For frontend and fullstack development take a look at the Developer Quick Start Guides (https://github.com/gguuttss/radix-docs/blob/master/build/developer-quick-start/README.md) and for Radix Engine development in Scrypto you'll be able to get started by setting up Scrypto here (https://github.com/gguuttss/radix-docs/blob/master/build/setting-up-for-scrypto-development/getting-rust-scrypto.md) . ## Setting up for Development URL: https://radix.wiki/developers/legacy-docs/build/setting-up-for-scrypto-development/setting-up-for-scrypto-development Updated: 2026-02-18 Summary: Legacy documentation: Setting up for Development Build > Setting up for Development — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/setting-up-for-scrypto-development/README.md) ## Scrypto (Ledger) URL: https://radix.wiki/developers/legacy-docs/build/developer-quick-start/scrypto-ledger Updated: 2026-02-18 Summary: This topic is still a work in progress Build > Developer Quick Start > Scrypto (Ledger) — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/developer-quick-start/scrypto-ledger.md) This topic is still a work in progress This area is either a placeholder for content which is yet to be written, or has sparse/early content which is still being worked on. Quickstart Guide for Scrypto In the meantime here are a few things you might be interested in for Scrypto development.
- Find out how to Install the Scrypto Toolchain (https://github.com/gguuttss/radix-docs/blob/master/build/setting-up-for-scrypto-development/getting-rust-scrypto.md) - Have a look at the Learning Step-by-Step (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/README.md) where the Scrypto and Front End path (../learning-step-by-step/README.md#scrypto-and-front-end) highlights any sections you can ignore if you don't want to worry about a front end. - The main Scrypto docs pages (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/README.md) . - The Rust Scrypto docs (https://docs.rs/scrypto/latest/scrypto/index.html) - The Stokenet Dev Console (https://stokenet-console.radixdlt.com/) , and just in case you skipped past it on your way here the Radix Dashboard page (https://github.com/gguuttss/radix-docs/blob/master/use/radix-dashboard.md) . - The Reference Materials page (/useful-links) is loaded with important links that could be useful for developing in Scrypto ## Fullstack dApp (Client, Server and Ledger) URL: https://radix.wiki/developers/legacy-docs/build/developer-quick-start/fullstack-dapp-development Updated: 2026-02-18 Summary: This area is either a placeholder for content which is yet to be written, or has sparse/early content which is still being worked on. Build > Developer Quick Start > Fullstack dApp (Client, Server and Ledger) — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/developer-quick-start/fullstack-dapp-development.md) This topic is still a work in progress This area is either a placeholder for content which is yet to be written, or has sparse/early content which is still being worked on. Quickstart Guide for a full stack dApp In the meantime here are a few things you might be interested in for a full stack dApp.
- Check out the page on ROLA (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/rola-radix-off-ledger-auth.md) , Radix Off-Ledger Authentication. - Have a look at the Learning Step-by-Step (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/README.md) , particularly the lists of lessons for front end development - The Stokenet Dev Console (https://stokenet-console.radixdlt.com/) and just in case you skipped past it on your way here the Radix Dashboard page (https://github.com/gguuttss/radix-docs/blob/master/use/radix-dashboard.md) . - The Radix Engine Toolkit (https://github.com/gguuttss/radix-docs/blob/master/integrate/radix-engine-toolkit/README.md) is packed full of functionality a full stack dev is likely to find useful. - The Reference Materials page (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/useful-links.md) is loaded with important links for things you will need for a full stack dApp - For the frontend check out dApp Frontend Development (https://github.com/gguuttss/radix-docs/blob/master/build/developer-quick-start/dapp-frontend-development.md) and Building a Frontend dApp (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/building-a-frontend-dapp.md) - To get started with Scrypto you will need to Install the Scrypto Toolchain (https://github.com/gguuttss/radix-docs/blob/master/build/setting-up-for-scrypto-development/getting-rust-scrypto.md) - You will want to access the Gateway API or Core API. Details on those are here (https://github.com/gguuttss/radix-docs/blob/master/integrate/network-apis/README.md) ## Frontend dApp (Client and Ledger) URL: https://radix.wiki/developers/legacy-docs/build/developer-quick-start/dapp-frontend-development Updated: 2026-02-18 Summary: This area is either a placeholder for content which is yet to be written, or has sparse/early content which is still being worked on.
Build > Developer Quick Start > Frontend dApp (Client and Ledger) — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/developer-quick-start/dapp-frontend-development.md) This topic is still a work in progress This area is either a placeholder for content which is yet to be written, or has sparse/early content which is still being worked on. Frontend quick start guide In the meantime here are a few things you might be interested in for a dApp frontend. - Check out Building a Frontend dApp (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/building-a-frontend-dapp.md) - Have a look at the Learning Step-by-Step (https://github.com/gguuttss/radix-docs/blob/master/build/learning-step-by-step/README.md) , particularly the lists of lessons for front end development - The Reference Materials page (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/useful-links.md) is loaded with important links for things you will need for a dApp frontend. - The Stokenet Dev Console (https://stokenet-console.radixdlt.com/) and just in case you skipped past it on your way here the Radix Dashboard page (https://github.com/gguuttss/radix-docs/blob/master/use/radix-dashboard.md) . - Radix dApp toolkit documentation on GitHub (https://github.com/radixdlt/radix-dapp-toolkit#readme) - Radix Connector extension on GitHub (https://github.com/radixdlt/connector-extension/releases/tag/v1.0.0) if you need a dev version of the connector extension.
- Radix Network APIs (https://github.com/gguuttss/radix-docs/blob/master/integrate/network-apis/README.md) - The Gumball Machine Example on GitHub (https://github.com/radixdlt/scrypto-examples/tree/main/full-stack/dapp-toolkit-gumball-machine) ## Developer Quick Start URL: https://radix.wiki/developers/legacy-docs/build/developer-quick-start/developer-quick-start Updated: 2026-02-18 Summary: Legacy documentation: Developer Quick Start Build > Developer Quick Start — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/developer-quick-start/README.md) ## Build URL: https://radix.wiki/developers/legacy-docs/build/build Updated: 2026-02-18 Summary: Legacy documentation: Build Build — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/build/README.md) ## Rust Transaction Building URL: https://radix.wiki/developers/legacy-docs/integrate/rust-libraries/rust-transaction-building Updated: 2026-02-18 Summary: You can use the radix-transactions crate to build transactions. Integrate > Rust Libraries > Rust Transaction Building — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/rust-libraries/rust-transaction-building.md) You can use the radix-transactions (https://docs.rs/radix-transactions/latest) crate to build transactions. Other languages - The UniFFI Radix Engine Toolkit (https://github.com/gguuttss/radix-docs/blob/master/integrate/radix-engine-toolkit/README.md) enables building transactions in other languages (e.g. Swift, Kotlin, Go, Python), with a similar API to the below. - The TypeScript Radix Engine Toolkit allows building v1 transactions, but does not have support for building v2 transactions as of December 2024. Choosing the version to build If starting your programmatic integration now, we recommend building V2 transactions (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/transaction-overview.md) introduced at Cuttlefish.
Even if you don't use subintents, they offer a few advantages over V1 transactions, including: - Support for timestamp expiry. - A collision-resistant intent discriminator. - More flexible instructions for assertions (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/manifest/manifest-instructions.md) . - A more user-friendly builder, which outputs a detailed transaction with hashes. Building for the Wallet In the unlikely circumstance where you are using the Rust manifest builder to build a manifest stub which will be used in the dApp toolkit, note that at Cuttlefish launch, only v1 instructions are supported in the wallet for the sendTransaction action. v2 instructions are supported in the pre-authorization flow (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-transactions/pre-authorizations-and-subintents.md) however. Transaction V2 To build V2 transactions, you can use the TransactionV2Builder (https://docs.rs/radix-transactions/latest/radix_transactions/builder/struct.TransactionV1Builder.html) , along with the ManifestBuilder (https://docs.rs/radix-transactions/latest/radix_transactions/builder/struct.ManifestBuilder.html) . You will need to integrate with the Gateway API (gateway-api) or Core API (core-api) to resolve the current epoch. Support for external signers (e.g. HSMs) is available via implementing the (blocking/non-async) Signer trait. Note the Curves, Keys, Signatures and Hashing (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/concepts-curves-keys-signatures-and-hashing.md) guide for the canonical serialization to use for keys and signatures. You will likely want to read the documentation on intent structure (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/intent-structure.md) before beginning.
Configuration // The notary can be an ephemeral key (in which case, set notary_is_signatory as false) // or a main signer key (in which case, set notary_is_signatory as true) let notary = ...; let signers = ...; let notary_is_signatory = ...; let tip_percentage = 0; let network = NetworkDefinition::mainnet(); let current_epoch = /* source from Core API or Gateway API */; // Recommended configuration // The transaction will expire at the end of the next epoch (in 5-10 minutes time) let intent_discriminator = rand::thread_rng().gen(); let start_epoch_inclusive = Epoch::of(current_epoch); let end_epoch_exclusive = Epoch::of(current_epoch + 2); Using subintents /* Use an existing signed partial transaction, from e.g. the pre-authorization flow */ let signed_partial_transaction = SignedPartialTransactionV2::from_raw(RawSignedPartialTransaction::from_hex(hex)); let subintent_hash = signed_partial_transaction .prepare(PreparationSettings::latest_ref()) .expect("Child signed partial transaction could not be prepared") .subintent_hash(); /* OR build your own from a subintent manifest */ let signed_partial_transaction = TransactionBuilder::new_partial_v2() .intent_header(IntentHeaderV2 { network_id, start_epoch_inclusive, end_epoch_exclusive, intent_discriminator, min_proposer_timestamp_inclusive, max_proposer_timestamp_exclusive, }) .manifest_builder(|builder| { builder .withdraw_from_account(account, XRD, 10) .take_all_from_worktop(XRD, "xrd") .yield_to_parent_with_name_lookup(|lookup| (lookup.bucket("xrd"),)) }) .sign(&signer_key) .build(); let subintent_hash = signed_partial_transaction.subintent_hash; Building a transaction let DetailedNotarizedTransactionV2 { transaction, raw, object_names, transaction_hashes, } = TransactionBuilder::new_v2() .transaction_header(TransactionHeaderV2 { notary_public_key, notary_is_signatory, tip_basis_points, }) .intent_header(IntentHeaderV2 { network_id, start_epoch_inclusive, end_epoch_exclusive, intent_discriminator, 
min_proposer_timestamp_inclusive, max_proposer_timestamp_exclusive, }) .add_signed_child("child", signed_partial_transaction) // The manifest_builder method automatically registers the child hash, so you can avoid // having to do `use_child("child", subintent_hash)` in the manifest .manifest_builder( |builder| { builder .yield_to_child("child", ()) .deposit_entire_worktop(account) }, ) .sign(&signer_key) .notarize(&notary_key) .build(); let transaction_payload_hex = raw.to_hex(); let transaction_intent_hash = transaction_hashes.transaction_intent_hash; // The transaction id let transaction_id = transaction_intent_hash.to_string(&network); Building a manifest by itself To build just a TransactionManifestV2, you can use the new_v2() method on ManifestBuilder, or to build a SubintentManifestV2, you can use the new_subintent_v2() method. Note that yield_to_parent(..) and verify_parent(..) are only available on subintent manifests: let transaction_manifest = ManifestBuilder::new_v2() .use_child("child", subintent_hash) .lock_standard_test_fee(account) .yield_to_child("child", ()) .build(); let subintent_manifest = ManifestBuilder::new_subintent_v2() .yield_to_parent(()) .build(); Legacy Transaction V1 To build basic V1 transactions, you can use the TransactionV1Builder (https://docs.rs/radix-transactions/latest/radix_transactions/builder/struct.TransactionV1Builder.html) , along with the ManifestBuilder (https://github.com/gguuttss/radix-docs/blob/master/integrate/rust-libraries/rust-manifest-builder.md) . You will need to integrate with the Gateway API (gateway-api) or Core API (core-api) to resolve the current epoch. Support for external signers (e.g. HSMs) is available via implementing the (blocking/non-async) Signer trait. 
Note the Curves, Keys, Signatures and Hashing (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/concepts-curves-keys-signatures-and-hashing.md) guide for the canonical serialization to use for keys and signatures. // The notary can be an ephemeral key (in which case, set notary_is_signatory as false) // or a main signer key (in which case, set notary_is_signatory as true) let notary = ...; let signers = ...; let notary_is_signatory = ...; let tip_percentage = 0; let network = NetworkDefinition::mainnet(); let current_epoch = /* source from Core API or Gateway API */; // Recommended configuration // The transaction will expire at the end of the next epoch (in 5-10 minutes time) let intent_discriminator = rand::thread_rng().gen(); let start_epoch_inclusive = Epoch::of(current_epoch); let end_epoch_exclusive = Epoch::of(current_epoch + 2); let manifest = ManifestBuilder::new_v1() /** add instructions **/ .build(); let builder = TransactionV1Builder::new() .header(TransactionHeaderV1 { network_id: network.id, start_epoch_inclusive, end_epoch_exclusive, nonce: intent_discriminator, notary_public_key: notary.public_key().into(), notary_is_signatory, tip_percentage, }) .manifest(manifest) .multi_sign(&signers) .notarize(&notary); let raw_transaction = builder.build().to_raw().unwrap(); let transaction_bytes_hex = raw_transaction.to_hex(); let transaction_intent_hash = raw_transaction .prepare(PreparationSettings::latest_ref()) .expect("Transaction could not be prepared") .transaction_intent_hash(); let transaction_id = transaction_intent_hash.to_string(&network); ## Rust Manifest Builder URL: https://radix.wiki/developers/legacy-docs/integrate/rust-libraries/rust-manifest-builder Updated: 2026-02-18 Summary: You may also wish to read the ManifestBuilder rust docs on docs.rs. 
Integrate > Rust Libraries > Rust Manifest Builder — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/rust-libraries/rust-manifest-builder.md) Examples Examples of the Rust Manifest Builder can be found in the experimental-examples repository (https://github.com/radixdlt/experimental-examples/tree/main/manifest-builder) . You may also wish to read the ManifestBuilder rust docs on docs.rs (https://docs.rs/radix-transactions/latest/radix_transactions/builder/struct.ManifestBuilder.html) . This page explains how to use the Rust ManifestBuilder, which lets you write transaction manifests without leaving the Rust environment when building your Scrypto package. It first shows how to import the ManifestBuilder module in your Rust file, then walks through a few examples of the general structure of creating transaction manifests, shows how to generate a manifest String in the .rtm file format for transaction submission, and closes with a reference for each ManifestBuilder method. Importing the Rust ManifestBuilder Every Scrypto package generated from scrypto new-package will contain a “tests” folder with a lib.rs file. This lib.rs file will contain the integration tests for your Scrypto package. The generated file already contains the necessary Scrypto crate imports to begin testing and writing transaction manifests, which will look like this: use scrypto::prelude::*; use scrypto_test::{prelude::*, utils::dump_manifest_to_file_system}; use hello_token::test_bindings::*; With the ManifestBuilder module imported, we can now easily create transaction manifests. Writing Manifests with the ManifestBuilder We’ll provide a few examples to help familiarize you with the general structure of writing manifests with the ManifestBuilder. 
However, it’s worth noting that the general structure will consist of: - Creating a new instance of the ManifestBuilder using ManifestBuilder::new() - Chaining together a series of instructions for the transaction manifest. - Calling .build() as the last method to build the transaction manifest. Calling a Function and Instantiating a Package let manifest = ManifestBuilder::new() // #1 .lock_fee_from_faucet() .call_function( // #2 package_address, "Hello", "instantiate_hello", manifest_args!(), // #3 ) .build(); // #4 - To create a new instance of the ManifestBuilder. - The call_function method is one of several methods the ManifestBuilder provides to assist us with writing manifest instructions. - The manifest_args!() is a macro which we can use to conveniently pass in function and method arguments. As of RCnet v3, it is optional, and a tuple (…​,) can be used for arguments instead. Be careful if using tuples that singleton tuples will require a trailing comma. - At the end of the list of manifest instructions, we call the .build() method to generate a TransactionManifest. Method Calls let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .call_method( // #1 account_address, "withdraw", ( resource_address, // #2 amount, ) ) .call_method( // #3 account_address, "deposit_batch", ( ManifestExpression::EntireWorktop, // #4 ) ) .build(); - call_method is another method ManifestBuilder provides. This method is an instruction to perform a method call. - We can pass in several arguments for our method call with different types with each argument separated by commas. - We can chain multiple manifest instructions together within a TransactionManifest. - We can pass expressions as arguments as well. 
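The new() / chained instructions / build() shape used in these examples is the standard Rust builder pattern, where each instruction method takes `self` by value and returns it. A dependency-free toy illustration (ToyManifestBuilder and its string "instructions" are invented for this sketch and are not part of any Radix crate):

```rust
// Toy model of the ManifestBuilder chaining style. Real manifests contain
// typed instructions, not strings; this only shows the builder shape.
#[derive(Default)]
struct ToyManifestBuilder {
    instructions: Vec<String>,
}

impl ToyManifestBuilder {
    fn new() -> Self {
        Self::default()
    }

    // Each instruction method consumes `self` and returns it,
    // which is what makes the chaining style work.
    fn lock_fee_from_faucet(mut self) -> Self {
        self.instructions.push("LOCK_FEE".into());
        self
    }

    fn call_method(mut self, component: &str, method: &str) -> Self {
        self.instructions
            .push(format!("CALL_METHOD {component} {method}"));
        self
    }

    // `build()` finalizes the chain and hands back the accumulated result.
    fn build(self) -> Vec<String> {
        self.instructions
    }
}

fn main() {
    let manifest = ToyManifestBuilder::new()
        .lock_fee_from_faucet()
        .call_method("component_sim1abc", "free")
        .build();
    assert_eq!(manifest.len(), 2);
}
```

Because each method returns the builder, forgetting the final .build() leaves you with a builder rather than a manifest, which the compiler will flag at the point of use.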
More info here (specifications) Special Account Methods let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_from_account( // #1 account_address, resource_address, amount ) .try_deposit_batch_or_abort(account_address) // #2 .build(); - The ManifestBuilder also provides several methods to easily call account component methods. - try_deposit_batch_or_abort is another convenient account method we can use. The call_method provided by the ManifestBuilder allows us to flexibly write component method call instructions. However, accounts on Radix are native and the ManifestBuilder gives us special methods as a convenience to call account component methods. Using the Worktop The ManifestBuilder has several worktop methods we can use to account for and manage asset flows between component method calls. let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_from_account( account_address, resource_address, dec!(10) ) .take_all_from_worktop( // #1 resource_address, "bucket" // #2 ) .deposit(account_address, "bucket") .build(); - take_all_from_worktop is one of several methods the ManifestBuilder offers to allow us to easily work with assets returned to the worktop from component method calls. - The take_all_from_worktop method has a second argument which requires us to pass a name for the Bucket which will hold our resource to create a named Bucket. Resolving Named Buckets and Proofs The ManifestBuilder provides convenient methods to retrieve resources from the worktop or pop a Proof from the AuthZone to create named buckets and proofs. These are named because we want to refer to them later on by passing them as arguments to methods. Some methods provided by the ManifestBuilder conveniently allow you to pass named buckets and proofs that were previously created. 
For example, we can withdraw a resource from an account, take the resource from the worktop, and deposit it into an account like so: let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_from_account( account_address, resource_address, dec!(10) ) .take_all_from_worktop(resource_address, "bucket") .deposit(account_address, "bucket") // #1 .build(); - The deposit method provides a clean and convenient way to resolve named buckets. However, it’s not as convenient when it comes to passing arguments to a call_method or call_function. Therefore, we need to use a lookup function to resolve these named buckets and proofs by using with_name_lookup methods. let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_from_account( account_address, resource_address, dec!(1) ) .take_all_from_worktop( resource_address, "payment_bucket" ) .call_method_with_name_lookup( // #1 component_address, "buy_gumball", |lookup| ( // #2 lookup.bucket("payment_bucket"), // #3 ) ) .deposit_batch(account_address) .build(); - Using call_method_with_name_lookup allows us to pass named arguments such as the named "payment_bucket" we created. - lookup is an arbitrary variable we’re using to create our callback function which allows us to resolve our named buckets and proofs. - By using lookup.bucket() and passing "payment_bucket", we can pass it as an argument and resolve the Bucket we created earlier. Using the AuthZone The AuthZone has similar mechanics to the worktop in that things are pushed to this layer, except, instead of resources, they are proofs. Proofs created by most Proof creation methods are automatically pushed to the AuthZone, and popping a Proof from the AuthZone requires the Proof to be named, much like taking resources from the worktop and into a Bucket. 
let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .pop_from_auth_zone("proof") // #1 .call_method_with_name_lookup( component_address, "method_requiring_named_proof", |lookup| ( lookup.proof("proof"), ) ) .deposit_batch(account_address) .build(); - Similar to taking resources from the worktop and placing them in a named Bucket, popping a Proof from the AuthZone will require you to name it (e.g "proof"). Creating Named Proofs The ManifestBuilder offers instructions to create named proofs. These named proofs are not automatically pushed to the AuthZone and must manually be pushed to the AuthZone or be passed into an argument of a method call. let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .create_proof_from_auth_zone_of_amount( resource_address, dec!(1), "proof" // #1 ) .push_to_auth_zone("proof") // #2 .build(); - Creating a named Proof called "proof". - Manually pushing the named "proof" to the AuthZone. Common Transaction Manifest Instructions Transferring Tokens Between Accounts let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_from_account( account_address, resource_address, dec!(100) ) .call_method( // #1 account_address2, "try_deposit_batch_or_abort", manifest_args!( ManifestExpression::EntireWorktop ) ) .build(); - Previously, we’ve been using .deposit_batch method to conveniently deposit any resources from the worktop to an account. However, since accounts are also components, we can use .call_method to interact with accounts (accounts are components on Radix after all). Transferring Tokens to Multiple Accounts We want to withdraw tokens (resources) from an account, equally split them, and send them to two accounts. To do that, we can use the .take_from_worktop method to specify how many tokens we would like to deposit into each account. 
let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_from_account( // #1 account_address, resource_address, dec!(200) ) .take_from_worktop( // #2 resource_address, dec!(100), "bucket1" ) .take_all_from_worktop( // #3 resource_address, "bucket2" ) .try_deposit_or_abort( // #4 account_1_address, "bucket1" ) .try_deposit_or_abort( account_2_address, "bucket2" ) .build(); - We are withdrawing 200 tokens from an account to the worktop. - The first .take_from_worktop instruction takes 100 of the 200 tokens on the worktop and puts them into a Bucket named bucket1. - Since we know the remainder of tokens on the worktop, we can simply use .take_all_from_worktop and place it in a Bucket named bucket2. - Then we can start depositing the buckets into the other accounts, passing our named buckets along with the account address we wish to send to. Exporting TransactionManifest as a String to an .rtm File Format Building a transaction on the Radix Network requires a transaction manifest to describe the intent of resource movement between components. Therefore, the scrypto_unit module provides a convenient function: dump_manifest_to_file_system, to generate a transaction manifest file in the .rtm file format. This function also statically validates the transaction manifest before it is converted to a manifest String. let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_from_account( account_address, resource_address, dec!(10) ) .try_deposit_batch_or_abort(account_address); // This generates the .rtm file of the Transaction Manifest. dump_manifest_to_file_system( // #1 manifest.object_names(), // #2 &manifest.build(), // #3 "./transaction-manifest", // #4 Some("manifest_name"), // #5 &NetworkDefinition::simulator() // #6 ).err(); // #7 - We are using dump_manifest_to_file_system to generate a transaction manifest String as an .rtm file based on the reference of the manifest we created. 
- Calling object_names returns ManifestObjectNames to provide the function tracking of named buckets and proofs (if available). - Takes a reference of the built TransactionManifest. - Specifying the path directory where this .rtm file will be generated, if the directory does not exist, then it will be created. - Specifying the Transaction Manifest name. - Specifying the network which entity addresses will be encoded to. Please see addressing page (addressing-on-radix) for reference. - Generates an error if the transaction manifest is statically invalid. Calling the dump_manifest_to_file_system function will generate an .rtm file in the ./transaction-manifest directory which will look like this: CALL_METHOD Address("component_sim1cptxxxxxxxxxfaucetxxxxxxxxx000527798379xxxxxxxxxhkrefh") "lock_fee" Decimal("5000") ; CALL_METHOD Address("account_sim1c8ng5f2pmcxart0t5y9gftcymuzpkaytavy852mx74txkqamfp9y8w") "withdraw" Address("resource_sim1tknxxxxxxxxxradxrdxxxxxxxxx009923554798xxxxxxxxxakj8n3") Decimal("10") ; CALL_METHOD Address("account_sim1c8ng5f2pmcxart0t5y9gftcymuzpkaytavy852mx74txkqamfp9y8w") "try_deposit_batch_or_abort" Expression("ENTIRE_WORKTOP") ; The entity addresses encoded in your .rtm file will look different based on the addresses inputted and the network specified to be encoded to. Rust ManifestBuilder Methods The list below provides the method name, summary and example for each of the methods the Rust ManifestBuilder supports. If you feel more at home in Rust Docs you can check out the ManifestBuilder here (https://docs.rs/scrypto-test/latest/scrypto_test/prelude/struct.ManifestBuilder.html) . Account, Identity, AccessController new_account Creates a new account native component. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .new_account() .build(); new_account_advanced Creates a new account with specified OwnerRole. 
Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .new_account_advanced( OwnerRole::Updatable( rule!(resource_address) ) ) .build(); withdraw_from_account Withdraws a specified fungible resource from an account component and places into the worktop. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_from_account( account_address, resource_address, dec!(1) ) .deposit_batch(account_address) .build(); withdraw_non_fungibles_from_account Withdraws a specified non-fungible resource from an account component and places into the worktop. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_non_fungibles_from_account( account_address, resource_address, indexset!(NonFungibleLocalId::integer(1), NonFungibleLocalId::integer(2)) ) .deposit_batch(account_address) .build(); burn_in_account Burns a fungible resource with a burnable resource behavior within the account. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .burn_in_account( account_address, resource_address, dec!(1) ) .build(); deposit_batch Drains all resources from the worktop and deposits all into a specified account component. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_from_account( account_address, resource_address, dec!(1) ) .deposit_batch(account_address) .build(); try_deposit_batch_or_abort Attempts to deposit a Bucket of resource to an account or aborts the transaction if the account does not allow the resource to be deposited. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_from_account( account_address, resource_address, dec!(1) ) .try_deposit_batch_or_abort(account_address) .build(); try_deposit_batch_or_refund Attempts to deposit a Bucket of resource to an account or returns the Bucket of resource to the originator if the account does not allow the resource to be deposited. 
Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_from_account( account_address, resource_address, dec!(1) ) .try_deposit_batch_or_refund(account_address) .build(); create_identity_advanced Creates an identity native component with an owner role configuration. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_identity_advanced( OwnerRole::Fixed( rule!(require(resource_address)) ) ) .build(); create_identity Creates an identity native component. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_identity() .deposit_batch(account_address) .build(); create_access_controller Creates an access controller native component with the controlled resource and specified authority roles. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_from_account( account_address, resource_address, dec!(1) ) .take_from_worktop( resource_address, dec!(1), "bucket" ) .create_access_controller( "bucket", rule!(require(primary_role_badge)), rule!(require(recovery_role_badge)), rule!(require(confirmation_role_badge)), None ) .build(); Call Function and Call Method call_function Calls a function where the arguments should be an array of encoded Scrypto values. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .call_function( package_address, "Hello", "instantiate_hello", manifest_args!() ) .build(); call_function_with_name_lookup Calls a function with a lookup callback function to resolve named Bucket or Proof. 
Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .pop_from_auth_zone("proof") .call_function_with_name_lookup( package_address, "ExampleBlueprint", "instantiate", |lookup| ( lookup.proof("proof"), ) ) .deposit_batch(account_address) .build(); call_method Calls a component method. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .call_method( component_address, "mint", manifest_args!() ) .deposit_batch(account_address) .build(); call_method_with_name_lookup Calls a method with a lookup callback function to resolve named Bucket or Proof. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .pop_from_auth_zone("proof") .call_method_with_name_lookup( component_address, "method_requiring_named_proof", |lookup| ( lookup.proof("proof"), ), ) .deposit_batch(account_address) .build(); Worktop take_all_from_worktop Takes all of a specified resource from the worktop and places it into a Bucket. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_from_account( account_address, resource_address, dec!(10) ) .take_all_from_worktop(resource_address, "bucket") .deposit(account_address, "bucket") .build(); take_from_worktop Takes a specified amount of a resource from the worktop and places it into a Bucket. 
Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_from_account( account_address, resource_address, dec!(10) ) .take_from_worktop( resource_address, dec!(10), "bucket") .deposit(account_address, "bucket") .build(); take_non_fungibles_from_worktop Takes specified non-fungibles from the worktop and puts into a Bucket Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_non_fungibles_from_account( account_address, resource_address, indexset!(NonFungibleLocalId::integer(1)) ) .take_non_fungibles_from_worktop( resource_address, indexset!(NonFungibleLocalId::integer(1)), "bucket" ) .deposit(account_address, "bucket") .build(); return_to_worktop Returns a resource back to the worktop. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_from_account( account_address, resource_address, dec!(10) ) .take_all_from_worktop( resource_address, "bucket" ) .return_to_worktop("bucket") .deposit_batch(account_address) .build(); assert_worktop_contains Asserts the worktop contains a particular resource. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_from_account( account_address, resource_address, dec!(10) ) .assert_worktop_contains( resource_address, dec!(10) ) .deposit_batch(account_address) .build(); assert_worktop_contains_any Asserts the worktop contains a specified resource of any amount. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_from_account( account_address, resource_address, dec!(10) ) .assert_worktop_contains_any(resource_address) .deposit_batch(account_address) .build(); assert_worktop_contains_non_fungibles Asserts the worktop contains specified non-fungibles. 
Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_non_fungibles_from_account( account_address, resource_address, indexset!(NonFungibleLocalId::integer(1)) ) .assert_worktop_contains_non_fungibles( resource_address, indexset!(NonFungibleLocalId::integer(1)) ) .deposit_batch(account_address) .build(); burn_from_worktop Burns a Bucket of resource with a burnable resource behavior. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_from_account( account_address, resource_address, dec!(1) ) .take_from_worktop( resource_address, dec!(1), "bucket" ) .burn_resource("bucket") .build(); burn_all_from_worktop Burns all resources in the worktop. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_from_account( account_address, resource_address, dec!(100) ) .burn_all_from_worktop(resource_address) .build(); burn_non_fungible_from_worktop Burns a non-fungible resource from worktop. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_non_fungibles_from_account( account_address, resource_address, indexset!(NonFungibleLocalId::integer(1)) ) .burn_non_fungible_from_worktop(non_fungible_global_id); AuthZone pop_from_auth_zone Pops the last Proof entered to the AuthZone. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .pop_from_auth_zone("proof") .drop_proof("proof") .build(); push_to_auth_zone Pushes a named Proof back into the AuthZone. 
Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_from_account( account_address, resource_address, dec!(1) ) .take_from_worktop( resource_address, dec!(1), "bucket" ) .create_proof_from_bucket_of_amount( "bucket", dec!(1), "proof" ) .push_to_auth_zone("proof") .call_method( component_address, "authorized_method", manifest_args!() ) .pop_from_auth_zone("popped_proof") .drop_proof("popped_proof") .deposit(account_address, "bucket") .build(); clear_auth_zone Clears the AuthZone of all proofs. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .clear_auth_zone() .build(); Proof create_proof_from_bucket_of_amount Creates a Proof of a specified amount from a named Bucket containing a fungible resource. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_from_account( account_address, resource_address, dec!(1) ) .take_from_worktop( resource_address, dec!(1), "bucket" ) .create_proof_from_bucket_of_amount( "bucket", dec!(1), "proof" ) .call_method_with_name_lookup( component_address, "method_requiring_named_proof", |lookup| ( lookup.proof("proof"), ), ) .deposit(account_address, "bucket") .build(); create_proof_from_bucket_of_non_fungibles Creates a Proof from a named Bucket containing a non-fungible resource. 
Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_non_fungibles_from_account( account_address, resource_address, indexset!( NonFungibleLocalId::integer(1), ) ) .take_non_fungibles_from_worktop( resource_address, indexset!( NonFungibleLocalId::integer(1) ), "bucket" ) .create_proof_from_bucket_of_non_fungibles( "bucket", indexset!( NonFungibleLocalId::integer(1) ), "proof" ) .call_method_with_name_lookup( component_address, "method_requiring_named_proof", |lookup| ( lookup.proof("proof"), ), ) .deposit(account_address, "bucket") .build(); create_proof_from_bucket_of_all Creates a Proof of the entire amount of resource contained within a Bucket. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_from_account( account_address, resource_address, dec!(1) ) .take_all_from_worktop( resource_address, "bucket" ) .create_proof_from_bucket_of_all( "bucket", "proof" ) .call_method_with_name_lookup( component_address, "method_requiring_named_proof", |lookup| ( lookup.proof("proof"), ), ) .deposit(account_address, "bucket") .build(); clone_proof Clones a named Proof. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .pop_from_auth_zone("proof") .clone_proof( "proof", "cloned_proof" ) .clear_auth_zone() .drop_all_proofs() .build(); drop_proof Drops a single named Proof. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .pop_from_auth_zone("proof") .drop_proof("proof") .build(); drop_all_proofs Drops all named proofs. 
Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(2) ) .drop_all_proofs() .build(); create_proof_from_account_of_amount Creates a Proof of a specified amount of a specified resource from an account component and pushes it to the AuthZone. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .build(); create_proof_from_account_of_non_fungibles Creates a Proof of the specified non-fungibles. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_non_fungibles( account_address, resource_address, indexset!(NonFungibleLocalId::integer(1)) ) .build(); create_proof_from_auth_zone_of_amount Creates a named Proof of a specified amount from the proofs in the AuthZone. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .create_proof_from_auth_zone_of_amount( resource_address, dec!(1), "proof" ) .push_to_auth_zone("proof") .call_method( component_address, "authorized_method_requiring_two_proofs", manifest_args!() ) .clear_auth_zone() .build(); create_proof_from_auth_zone_of_non_fungibles Creates a named Proof of specified non-fungibles from the proofs currently in the AuthZone. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_non_fungibles( account_address, resource_address, indexset!(NonFungibleLocalId::integer(1)) ) .create_proof_from_auth_zone_of_non_fungibles( resource_address, indexset!(NonFungibleLocalId::integer(1)), "proof" ) .push_to_auth_zone("proof") .call_method( component_address, "authorized_method_requiring_two_proofs", manifest_args!() ) .clear_auth_zone() .build(); create_proof_from_auth_zone_of_all Creates a named Proof from all proofs that are currently in the AuthZone. 
Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .create_proof_from_auth_zone_of_all( resource_address, "proof" ) .push_to_auth_zone("proof") .call_method( component_address, "authorized_method_requiring_two_proofs", manifest_args!() ) .clear_auth_zone() .build(); Resources and Badges create_fungible_resource Create a fungible resource with configuration for divisibility, metadata, roles, and an optional initial supply. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_fungible_resource( OwnerRole::None, false, 0u8, FungibleResourceRoles { mint_roles: mint_roles!( minter => rule!(allow_all); minter_updater => rule!(deny_all); ), burn_roles: None, freeze_roles: None, recall_roles: None, withdraw_roles: None, deposit_roles: None }, Default::default(), Some(dec!(1000)) ) .deposit_batch(account_address) .build(); create_non_fungible_resource Create a non-fungible resource with configuration for NonFungibleLocalIdType, metadata, roles, and an optional initial supply. First declare the struct NFT and specify the arguments. Example #[derive(ScryptoSbor, NonFungibleData, ManifestSbor)] pub struct NFTData { name: String, ... } let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_non_fungible_resource( OwnerRole::None, NonFungibleIdType::Integer, false, NonFungibleResourceRoles { mint_roles: mint_roles!( minter => rule!(allow_all); minter_updater => rule!(deny_all); ), burn_roles: None, freeze_roles: None, recall_roles: None, withdraw_roles: None, deposit_roles: None, non_fungible_data_update_roles: None }, Default::default(), Some( [( NonFungibleLocalId::integer(1), NFTData { name: "Bob".to_owned(), } )] ) ) .deposit_batch(account_address) .build(); To create a non-fungible resource with no initial supply Example #[derive(ScryptoSbor, NonFungibleData, ManifestSbor)] pub struct NFTData { name: String, key_image_url: Url, ... 
} ... let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_non_fungible_resource::<Vec<(NonFungibleLocalId, NFTData)>, NFTData>( OwnerRole::None, NonFungibleIdType::Integer, false, NonFungibleResourceRoles { mint_roles: mint_roles!( minter => rule!(allow_all); minter_updater => rule!(deny_all); ), burn_roles: None, freeze_roles: None, recall_roles: None, withdraw_roles: None, deposit_roles: None, non_fungible_data_update_roles: None }, Default::default(), None ) .deposit_batch(account_address) .build(); create_ruid_non_fungible_resource Create a non-fungible resource with a NonFungibleLocalId::RUID, metadata, roles, and an optional initial supply. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_ruid_non_fungible_resource( OwnerRole::None, false, Default::default(), NonFungibleResourceRoles { mint_roles: mint_roles!( minter => rule!(allow_all); minter_updater => rule!(deny_all); ), burn_roles: None, freeze_roles: None, recall_roles: None, withdraw_roles: None, deposit_roles: None, non_fungible_data_update_roles: None }, Some([Nft { name: "Bob".to_owned() }]) ) .deposit_batch(account_address) .build(); new_token_mutable Creates a fungible resource with a mutable supply. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .new_token_mutable( Default::default(), AccessRule::AllowAll ) .build(); new_token_fixed Creates a fungible resource with a fixed supply. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .new_token_fixed( OwnerRole::None, Default::default(), dec!(1000) ) .deposit_batch(account_address) .build(); new_badge_mutable Creates a badge resource with a mutable supply. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .new_badge_mutable( Default::default(), AccessRule::AllowAll ) .build(); new_badge_fixed Creates a badge resource with a specified OwnerRole and initial supply. 
Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .new_badge_fixed( OwnerRole::None, Default::default(), dec!(1) ) .deposit_batch(account_address) .build(); mint_fungible Mints a specified amount of a fungible resource and places it on the worktop. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .mint_fungible( resource_address, dec!(100) ) .deposit_batch(account_address) .build(); mint_non_fungible Mints a non-fungible resource with a specified non-fungible id and non-fungible data, then places it on the worktop. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .mint_non_fungible( resource_address, [(NonFungibleLocalId::integer(1), Nft { name: "Bob".into() })] ) .deposit_batch(account_address) .build(); mint_ruid_non_fungible Mints a RUID id type non-fungible resource with specified non-fungible data and no pre-determined non-fungible id (the RUID local id is auto-generated), then places it on the worktop. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .mint_ruid_non_fungible( resource_address, [Nft { name: "Bob".into() }] ) .deposit_batch(account_address) .build(); recall Retrieves a fungible resource with a recallable resource behavior from a specified Vault. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .recall( vault_id, dec!(1) ) .deposit_batch(account_address) .build(); recall_non_fungibles Retrieves a non-fungible resource with a recallable resource behavior from a specified Vault. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .recall_non_fungibles( vault_id, indexset!(NonFungibleLocalId::integer(1)) ) .deposit_batch(account_address) .build(); freeze_withdraw Freezes withdrawal of a resource with a freezable resource behavior from a specified Vault. 
Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .freeze_withdraw(vault_id) .build(); unfreeze_withdraw Unfreezes withdrawal of a resource with a freezable resource behavior from a specified Vault. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .unfreeze_withdraw(vault_id) .build(); freeze_deposit Freezes deposit of a resource with a freezable resource behavior into a specified Vault. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .freeze_deposit(vault_id) .build(); unfreeze_deposit Unfreezes deposit of a resource with a freezable resource behavior into a specified Vault. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .unfreeze_deposit(vault_id) .build(); freeze_burn Freezes burn functionality of a resource with a freezable resource behavior in a specified Vault. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .freeze_burn(vault_id) .build(); unfreeze_burn Unfreezes burn functionality of a resource with a freezable resource behavior in a specified Vault. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .unfreeze_burn(vault_id) .build(); Lock Fee lock_fee Locks a specified amount of XRD from an account’s XRD Vault for fee payment. Example let manifest = ManifestBuilder::new() .lock_fee( account_address, dec!(1) ) .build(); lock_standard_test_fee Locks the standard testing fee of XRD from the account’s XRD Vault. 
Example let manifest = ManifestBuilder::new() .lock_standard_test_fee(account_address) .build(); lock_fee_from_faucet Locks the standard testing fee from the system faucet. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .build(); lock_fee_and_withdraw Locks a specified amount of XRD from an account for fee payment and withdraws a specified amount of a resource to the worktop. Example let manifest = ManifestBuilder::new() .lock_fee_and_withdraw( account_address, dec!(1), resource_address, dec!(10) ) .deposit_batch(account_address) .build(); lock_fee_and_withdraw_non_fungibles Locks a specified amount of XRD from an account for fee payment and withdraws specified non-fungibles to the worktop. Example let manifest = ManifestBuilder::new() .lock_fee_and_withdraw_non_fungibles( account_address, dec!(10), resource_address, indexset!(NonFungibleLocalId::integer(1)) ) .deposit_batch(account_address) .build(); Validator stake_validator Stakes a Bucket of XRD to a validator. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_from_account( account_address, RADIX_TOKEN, dec!(100) ) .take_all_from_worktop( RADIX_TOKEN, "bucket" ) .stake_validator( validator_address, "bucket" ) .build(); unstake_validator Unstakes a Bucket of stake units from a validator. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .withdraw_from_account( account_address, stake_units, dec!(100) ) .take_all_from_worktop( stake_units, "bucket" ) .unstake_validator( validator_address, "bucket" ) .build(); Royalty claim_package_royalties Claims royalties from a blueprint package. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_non_fungibles( account_address, resource_address, indexset!(NonFungibleLocalId::integer(1)) ) .claim_package_royalties(package_address) .deposit_batch(account_address) .build(); set_component_royalty Updates a component method royalty configuration. 
Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .set_component_royalty( component_address, "mint", RoyaltyAmount::Xrd(dec!(1)) ) .build(); lock_component_royalty Locks a component’s method royalty configuration. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .lock_component_royalty( component_address, "mint", ) .build(); claim_component_royalties Claims royalties from a global component. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .claim_component_royalties(component_address) .deposit_batch(account_address) .build(); Role set_owner_role Configures an owner role of a global entity (e.g. PackageAddress, ComponentAddress, ResourceAddress). Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .set_owner_role( component_address, rule!(require(resource_address)) ) .build(); update_role Updates a role’s AccessRule of a specified global entity. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .update_role( component_address, ObjectModuleId::Main, RoleKey::from("admin"), rule!(require(resource_address)) ) .build(); get_role Retrieves a role’s AccessRule of a specified global entity. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .get_role( component_address, ObjectModuleId::Main, RoleKey { key: "admin".to_string() } ) .build(); Metadata set_metadata Sets a metadata field of a global entity (e.g. PackageAddress, ComponentAddress, ResourceAddress). 
Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .set_metadata( component_address, "name", MetadataValue::String("HelloComponent".into()) ) .build(); lock_metadata Locks a global entity’s metadata field from being updated. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .lock_metadata(component_address, "name") .build(); freeze_metadata Freezes a global entity’s metadata field from being updated. Example let manifest = ManifestBuilder::new() .lock_fee_from_faucet() .create_proof_from_account_of_amount( account_address, resource_address, dec!(1) ) .freeze_metadata( component_address, "name", ) .build(); Publish Package publish_package Publish a blueprint package. Example let (code, definition) = Compile::compile(this_package!()); let manifest = ManifestBuilder::new() .publish_package(code, definition) .deposit_batch(account_address) .build(); publish_package_advanced Publishes a blueprint package with custom configuration. Example let (code, definition) = Compile::compile(this_package!()); let manifest = ManifestBuilder::new() .publish_package_advanced( None, code, definition, metadata_init!(), OwnerRole::Updatable(rule!(require(resource_address))) ) .build(); publish_package_with_owner Publish a blueprint package with a specified owner badge. Example let (code, schema) = Compile::compile(this_package!()); let manifest = ManifestBuilder::new() .publish_package_with_owner( code, schema, NonFungibleGlobalId::new( resource_address, NonFungibleLocalId::integer(1) ) ) .build(); ## Rust Libraries Overview URL: https://radix.wiki/developers/legacy-docs/integrate/rust-libraries/rust-libraries-overview Updated: 2026-02-18 Summary: Rust is used as a key part of the Radix stack, and is published to crates.io with each release. 
Integrate > Rust Libraries > Rust Libraries Overview — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/rust-libraries/rust-libraries-overview.md) Rust is used as a key part of the Radix stack, and is published to crates.io with each release. There are a few Radix libraries which might be useful for integration or reference: - The Engine Rust Libraries - which power the Radix Engine in the node and resim - The Radix Engine Toolkit core - Wallet-compatible derivation - Sargon Engine Rust Crates These crates are published to crates.io at each protocol update. All these crates inter-depend, so you must ensure you use the same version across all imported crates. These crates don't use standard semver - we recommend explicitly fixing the version for each protocol update, and manually updating. Key libraries include: - scrypto (https://crates.io/crates/scrypto) ( docs (https://docs.rs/scrypto/latest) ) and scrypto-test (https://crates.io/crates/scrypto-test) ( docs (https://docs.rs/scrypto-test/latest) ) - The crates used to create and test scrypto applications. - radix-transactions (https://crates.io/crates/radix-transactions) ( docs (https://docs.rs/radix-transactions/latest) ) - Builders and models for Radix transactions. Includes the: - ManifestBuilder (https://docs.rs/radix-transactions/latest/radix_transactions/builder/struct.ManifestBuilder.html) ( article (https://github.com/gguuttss/radix-docs/blob/master/integrate/rust-libraries/rust-manifest-builder.md) ) - A rust-native tool for building any manifest (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/manifest/README.md) . 
- TransactionV1Builder (https://docs.rs/radix-transactions/latest/radix_transactions/builder/struct.TransactionV1Builder.html) - For building V1 Notarized Transactions (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/transaction-structure.md) - TransactionV2Builder (https://docs.rs/radix-transactions/latest/radix_transactions/builder/struct.TransactionV2Builder.html) - For building V2 Notarized Transactions (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/transaction-structure.md) - SignedPartialTransactionV2Builder (https://docs.rs/radix-transactions/latest/radix_transactions/builder/type.SignedPartialTransactionV2Builder.html) - For building V2 Signed Partial Transactions (https://github.com/gguuttss/radix-docs/blob/master/reference/transactions/transaction-structure.md) , the recommended way to build subintents when building up a transaction - AnyTransaction (https://docs.rs/radix-transactions/latest/radix_transactions/model/enum.AnyTransaction.html) - an enum of all transactions and the canonical transaction serialization - AnyManifest (https://docs.rs/radix-transactions/latest/radix_transactions/model/enum.AnyManifest.html) - an enum of all manifests - AnyInstruction (https://docs.rs/radix-transactions/latest/radix_transactions/model/enum.AnyInstruction.html) - a type alias for the latest version of instructions, an enum of all possible instructions. - radix-common (https://crates.io/crates/radix-common) ( docs (https://docs.rs/radix-common/latest) ) and radix-rust (https://crates.io/crates/radix-rust) ( docs (https://docs.rs/radix-rust/latest) ) - Various common Radix traits and types. Most libraries have an expansive prelude which covers most of their key functionality; using it is recommended to avoid import paths breaking between different versions. There are also other crates, such as radix-engine, radix-engine-interface and many others. But these aren't typically used as libraries by integrators. 
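Because the engine crates must all resolve to one exact version and don't follow standard semver, a Cargo.toml dependency section might pin them along these lines (a sketch only; the version number shown is illustrative, not a recommendation for any particular protocol update):

```toml
[dependencies]
# All engine crates inter-depend, so pin every one to the same exact version
# with "=" and bump them together manually at each protocol update.
scrypto = "=1.2.0"
scrypto-test = "=1.2.0"
radix-transactions = "=1.2.0"
radix-common = "=1.2.0"
radix-rust = "=1.2.0"
```

Using a plain caret requirement (e.g. `scrypto = "1.2.0"`) would let Cargo pick mismatched versions across crates, which is exactly what the guidance above warns against.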
Rust Toolkit The toolkit core (https://github.com/radixdlt/radix-engine-toolkit) is built in Rust, but it isn't published to crates.io and is mostly built to be exposed over UniFFI or WASM. It isn't intended to be used directly. We may more formally create a Rust toolkit with a more stable API at some point, but for now we recommend using the Engine crates directly (notably, the radix-transactions (https://crates.io/crates/radix-transactions) crate). Wallet compatible derivation The wallet-compatible-derivation (https://github.com/radixdlt/wallet-compatible-derivation) crate and CLI can be used to derive keys from mnemonics in the same manner as the official Radix Wallet, which enables creating a programmatic integration which can also be imported into a Radix wallet. Sargon The sargon (https://github.com/radixdlt/sargon) crate is a UniFFI library which forms the core of the official Android and iOS wallets. It's not intended to be consumed by anyone else, but might form a useful reference. It imports both the engine libraries and toolkit core (https://github.com/radixdlt/radix-engine-toolkit) . Babylon Node The Babylon node has a rust-core (https://github.com/radixdlt/babylon-node/tree/main/core-rust) component, which might be a useful reference. 
## Rust Libraries URL: https://radix.wiki/developers/legacy-docs/integrate/rust-libraries/rust-libraries Updated: 2026-02-18 Summary: Legacy documentation: Rust Libraries Integrate > Rust Libraries — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/rust-libraries/README.md) ## Examples URL: https://radix.wiki/developers/legacy-docs/integrate/radix-engine-toolkit/radix-engine-toolkit-examples Updated: 2026-02-18 Summary: The radix-engine-toolkit-examples repository contains examples built with the Radix Engine Toolkit C#, Kotlin, Python, and Swift wrappers Integrate > Radix Engine Toolkit > Examples — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/radix-engine-toolkit/radix-engine-toolkit-examples.md) The radix-engine-toolkit-examples (https://github.com/radixdlt/experimental-examples/tree/main/radix-engine-toolkit) repository contains examples built with the Radix Engine Toolkit C#, Kotlin, Python, and Swift wrappers to showcase how various frequently asked-about topics can be achieved through the Radix Engine Toolkit wrappers. Each one of the examples has an implementation in all four of the languages that have a UniFFI Radix Engine Toolkit wrapper. A readme file is provided with each of the examples explaining what the example showcases as well as a section on what the reader may learn by going through a particular example. Additionally, each language example comes with a run.sh script that can be called to run that particular example. These run.sh scripts were written for macOS and Linux, and there is no guarantee that they run on Windows. If they do not, the examples can be run in the same way as any program written in those languages is run on that particular operating system. This is most noticeable with Python, since different operating systems use different names for the Python CLI command: py, python, and python3. 
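Where run.sh doesn't work, one portable option is a small wrapper that picks whichever Python launcher the system actually provides (a sketch, not part of the examples repository; the `main.py` script name below is hypothetical):

```shell
#!/bin/sh
# Try the common Python launcher names in order (python3, python, py differ
# across operating systems) and use the first one found on PATH.
PYTHON=""
for candidate in python3 python py; do
    if command -v "$candidate" >/dev/null 2>&1; then
        PYTHON="$candidate"
        break
    fi
done
if [ -z "$PYTHON" ]; then
    echo "no Python launcher found" >&2
    exit 1
fi
echo "using: $PYTHON"
# "$PYTHON" main.py   # hypothetical entry point of one example
```

`command -v` is the POSIX-portable way to test for an executable, so this works under both bash and plain sh.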
The following is the recommended order of examples to go through: - transactions/construction-of-simple-transaction (https://github.com/radixdlt/experimental-examples/tree/main/radix-engine-toolkit/examples/transactions/construction-of-simple-transaction) : This example showcases how simple manifests and transactions can be constructed through the Radix Engine Toolkit with a focus on how the ManifestBuilder and TransactionBuilder can be used. Additionally, it showcases how random private keys can be generated through the secure randomness implementation available in different languages and how a virtual account address can be derived from the public key. - transactions/construction-of-simple-transaction-string-manifests (https://github.com/radixdlt/experimental-examples/tree/main/radix-engine-toolkit/examples/transactions/construction-of-simple-transaction-string-manifests) : This example is identical to the one above except for how the manifest is constructed: instead of using the ManifestBuilder to construct the manifest, a string of the manifest instructions (this is the contents of the .rtm files) is used. This example can be very easily tweaked to read a .rtm file instead of relying on a hardcoded string. - transactions/batch-transfers-from-csv: (https://github.com/radixdlt/experimental-examples/tree/main/radix-engine-toolkit/examples/transactions/batch-transfers-from-csv) This example shows how transfers can be made out of one account, of multiple resources, and into multiple different accounts. A CSV file is used as the source of the transfers to be performed out of the source account. This CSV file contains the destination account, the address of the resources to transfer, and the amount to transfer. The source account is derived from a private key that is hardcoded into the examples. This example shows how a simple CSV file could be used to describe transfers and how a more complex manifest could be built out of the contents of this CSV file. 
None of the examples provided make actual calls to the Gateway or Core API: the examples solely focus on the Radix Engine Toolkit and everything else is mocked. This is why all of the examples have a MockGatewayApiClient class which offers the correct interface of the Gateway API, but does not have any actual implementation for making HTTP calls to the gateway or anything of that sort. ## Manifest Builder URL: https://radix.wiki/developers/legacy-docs/integrate/radix-engine-toolkit/radix-engine-toolkit-usage-guide/radix-engine-toolkit-manifest-builder Updated: 2026-02-18 Summary: One of the core features offered in the Radix Engine Toolkit(../README.md) is a manifest builder that closely mimics the Rust Manifest Builder(../../rust-librar Integrate > Radix Engine Toolkit > Usage Guide > Manifest Builder — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/radix-engine-toolkit/radix-engine-toolkit-usage-guide/radix-engine-toolkit-manifest-builder.md) One of the core features offered in the Radix Engine Toolkit (https://github.com/gguuttss/radix-docs/blob/master/integrate/radix-engine-toolkit/README.md) is a manifest builder that closely mimics the Rust Manifest Builder (https://github.com/gguuttss/radix-docs/blob/master/integrate/rust-libraries/rust-manifest-builder.md) but that works over a foreign function interface (FFI). Thus, the Radix Engine Toolkit manifest builder is not identical to the one in Rust but quite similar. This article explores what methods the manifest builder has, what differences exist between it and the Rust manifest builder, and provides examples and guidance on how this manifest builder may be used in the various Radix Engine Toolkit wrappers. Methods Before discussing what methods are available in the Radix Engine Toolkit manifest builder, some context should be given to the reader on the various instruction models that exist. 
The instructions interpreter (the transaction processor) only understands a small handful of instructions. This includes instructions such as CallMethod, CallFunction, TakeFromWorktop, and so on. To the interpreter, there does not exist a CreateAccount, PublishPackage, SetMetadata, or other instructions despite them appearing in .rtm files. The transaction manifest compiler has several instruction aliases defined such as PublishPackage and maps those high-level instructions to their lower-level equivalent that can be processed by the interpreter; the lower-level representation of these instructions is typically a CallMethod or a CallFunction. In a way, the compiler extends the set of instructions the interpreter supports by adding various high-level aliases. The following links contain more information on this: - The list of instructions understood by the instructions interpreter (the transaction processor) can be found here (https://github.com/radixdlt/radixdlt-scrypto/blob/main/radix-transactions/src/model/v1/instruction_v1.rs#L526-L701) . - The list of instructions understood by the transaction manifest compiler can be found here (https://github.com/radixdlt/radixdlt-scrypto/blob/main/radix-transactions/src/manifest/ast.rs#L4-L256) . The following diagram shows how all of the various builders and compilers eventually create instructions from the base instruction set that’s understood by the instructions interpreter. The Radix Engine Toolkit manifest builder supports the following instructions: - All of the instructions understood by the instructions interpreter (the transaction processor). - Instruction aliases for all of the methods and functions of the native components. The Radix Engine Toolkit manifest builder supports all of the base instructions as well as many other instruction aliases. 
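As a mental model of this lowering step, the sketch below uses a toy builder (not the real toolkit API; all class, method, and address names here are illustrative only) in which a high-level alias such as account_lock_fee simply expands into the base CallMethod instruction that the interpreter understands:

```python
from dataclasses import dataclass, field
from typing import Any, List

@dataclass
class CallMethod:
    """Toy model of one of the base instructions the interpreter executes."""
    address: str
    method: str
    args: List[Any]

@dataclass
class ToyManifestBuilder:
    """Illustrative model only; not the Radix Engine Toolkit manifest builder."""
    instructions: List[CallMethod] = field(default_factory=list)

    def call_method(self, address: str, method: str, *args: Any) -> "ToyManifestBuilder":
        # Base instruction: appended to the instruction list as-is.
        self.instructions.append(CallMethod(address, method, list(args)))
        return self

    def account_lock_fee(self, address: str, amount: str) -> "ToyManifestBuilder":
        # High-level alias: lowered to a plain CallMethod, mirroring how the
        # compiler maps aliases to the base instruction set.
        return self.call_method(address, "lock_fee", amount)

builder = ToyManifestBuilder().account_lock_fee("some_account_address", "10")
assert builder.instructions == [CallMethod("some_account_address", "lock_fee", ["10"])]
```

The point of the sketch is only that aliases are a convenience layer: by the time the interpreter runs, nothing but base instructions remain.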
It is important to note that there may be differences between the instruction aliases supported by the Rust and Radix Engine Toolkit manifest builders, such as the name, the order of arguments, and so on. The naming scheme adopted for all of the aliases supported by this manifest builder will be: ${blueprint_name}_${method_or_function_name} which can already be seen in aliases such as account_lock_fee. Any aliases that are not supported by this manifest builder may be modeled as regular method or function calls by the client. The following is a full list of all of the methods that exist on the Radix Engine Toolkit manifest builder: List of all Radix Engine Toolkit manifest builder methods Base Instructions - take_all_from_worktop - take_from_worktop - take_non_fungibles_from_worktop - return_to_worktop - assert_worktop_contains_any - assert_worktop_contains - assert_worktop_contains_non_fungibles - pop_from_auth_zone - push_to_auth_zone - drop_auth_zone_proofs - drop_auth_zone_signature_proofs - drop_all_proofs - create_proof_from_auth_zone_of_all - create_proof_from_auth_zone_of_amount - create_proof_from_auth_zone_of_non_fungibles - create_proof_from_bucket_of_all - create_proof_from_bucket_of_amount - create_proof_from_bucket_of_non_fungibles - burn_resource - clone_proof - drop_proof - call_function - call_method - call_royalty_method - call_metadata_method - call_access_rules_method - call_direct_vault_method - allocate_global_address Misc Aliases - create_fungible_resource_manager - mint_fungible Faucet - faucet_free_xrd - faucet_lock_fee Account - account_create_advanced - account_create - account_securify - account_lock_fee - account_lock_contingent_fee - account_deposit - account_try_deposit_or_abort - account_try_deposit_or_refund - account_deposit_batch - account_try_deposit_batch_or_abort - account_try_deposit_batch_or_refund - account_deposit_entire_worktop - account_try_deposit_entire_worktop_or_refund - account_try_deposit_entire_worktop_or_abort - 
account_withdraw - account_withdraw_non_fungibles - account_lock_fee_and_withdraw - account_lock_fee_and_withdraw_non_fungibles - account_create_proof_of_amount - account_create_proof_of_non_fungibles - account_set_default_deposit_rule - account_set_resource_preference - account_remove_resource_preference - account_burn - account_burn_non_fungibles - account_add_authorized_depositor - account_remove_authorized_depositor Validator - validator_register - validator_unregister - validator_stake_as_owner - validator_stake - validator_unstake - validator_claim_xrd - validator_update_key - validator_update_fee - validator_update_accept_delegated_stake - validator_accepts_delegated_stake - validator_total_stake_xrd_amount - validator_total_stake_unit_supply - validator_get_redemption_value - validator_signal_protocol_update_readiness - validator_get_protocol_update_readiness - validator_lock_owner_stake_units - validator_start_unlock_owner_stake_units - validator_finish_unlock_owner_stake_units Access Controller - access_controller_new_from_public_keys - access_controller_create_with_security_structure - access_controller_create - access_controller_create_proof - access_controller_initiate_recovery_as_primary - access_controller_initiate_recovery_as_recovery - access_controller_initiate_badge_withdraw_as_primary - access_controller_initiate_badge_withdraw_as_recovery - access_controller_quick_confirm_primary_role_recovery_proposal - access_controller_quick_confirm_recovery_role_recovery_proposal - access_controller_quick_confirm_primary_role_badge_withdraw_attempt - access_controller_quick_confirm_recovery_role_badge_withdraw_attempt - access_controller_timed_confirm_recovery - access_controller_cancel_primary_role_recovery_proposal - access_controller_cancel_recovery_role_recovery_proposal - access_controller_cancel_primary_role_badge_withdraw_attempt - access_controller_cancel_recovery_role_badge_withdraw_attempt - access_controller_lock_primary_role - 
access_controller_unlock_primary_role - access_controller_stop_timed_recovery - access_controller_mint_recovery_badges Identity - identity_create_advanced - identity_create - identity_securify Package - package_publish - package_publish_advanced - package_claim_royalty One Resource Pool - one_resource_pool_instantiate - one_resource_pool_contribute - one_resource_pool_redeem - one_resource_pool_protected_deposit - one_resource_pool_protected_withdraw - one_resource_pool_get_redemption_value - one_resource_pool_get_vault_amount Two Resource Pool - two_resource_pool_instantiate - two_resource_pool_contribute - two_resource_pool_redeem - two_resource_pool_protected_deposit - two_resource_pool_protected_withdraw - two_resource_pool_get_redemption_value - two_resource_pool_get_vault_amount Multi Resource Pool - multi_resource_pool_instantiate - multi_resource_pool_contribute - multi_resource_pool_redeem - multi_resource_pool_protected_deposit - multi_resource_pool_protected_withdraw - multi_resource_pool_get_redemption_value - multi_resource_pool_get_vault_amount Metadata Module - metadata_get - metadata_set - metadata_lock - metadata_remove Role Assignment Module - role_assignment_get - role_assignment_set - role_assignment_set_owner - role_assignment_lock_owner Royalty Module - royalty_set - royalty_lock - royalty_claim Buckets, Proofs, NamedAddresses, and AddressReservations The Radix Engine Toolkit manifest builder follows a no-callbacks approach that allows for buckets, proofs, named-addresses, and address reservations to be created without the need for callbacks. This approach makes the manifest builder easier to use and allows it to work over the foreign function interface offered by the toolkit. Manifest builder methods that create buckets, proofs, named-addresses, or address reservations will be referred to as source methods, while methods that consume them will be referred to as sink methods. 
As an example, take_from_worktop is a source manifest builder method since it results in the creation of a new bucket, whereas burn_resource is a sink method as it consumes the bucket. The approach followed with transaction transient objects such as buckets, proofs, named addresses, and address reservations in the Radix Engine Toolkit manifest builder is as follows: - Source methods have an argument that controls the name given to the objects. - Transient objects are identified by the names given to them at the source methods when sink methods are called. The following are examples to better aid in explaining this: C# using static RadixEngineToolkit.RadixEngineToolkitUniffiMethods; using RadixEngineToolkit; using Decimal = RadixEngineToolkit.Decimal; const byte networkId = 0x02; var xrd = KnownAddresses(networkId).resourceAddresses.xrd; using var address = new Address("account_tdx_2_169ukwfsne0zvwrvsyk3mm3x7m6hggup52t6ng547m9a2qp6q5y99h8"); using var manifest = new ManifestBuilder() .FaucetFreeXrd() // The following is a source method that creates a bucket by taking a specified amount of // resources from the worktop and into a bucket. The name to use for this bucket is `xrdBucket`. // Any following method that wishes to refer to this bucket must do so using the name // given to it here. .TakeFromWorktop(xrd, new Decimal("10000"), new ManifestBuilderBucket("xrdBucket")) // The following is a sink method that consumes buckets by depositing them into an account. When // wishing to refer to the bucket created by the previous method, the same name that the bucket // was created with is used again here. 
.AccountTryDepositOrAbort(address, new ManifestBuilderBucket("xrdBucket"), null) .Build(networkId); Kotlin import com.radixdlt.ret.* fun main(args: Array<String>) { val networkId: UByte = 0x02u val xrd = knownAddresses(networkId).resourceAddresses.xrd val address = Address("account_tdx_2_169ukwfsne0zvwrvsyk3mm3x7m6hggup52t6ng547m9a2qp6q5y99h8") val manifest = ManifestBuilder() .faucetFreeXrd() // The following is a source method that creates a bucket by taking a specified amount // of resources from the worktop and into a bucket. The name to use for this bucket is // `xrdBucket`. Any following method that wishes to refer to this bucket must do so // using the name given to it here. .takeFromWorktop(xrd, Decimal("10000"), ManifestBuilderBucket("xrdBucket")) // The following is a sink method that consumes buckets by depositing them into an // account. When wishing to refer to the bucket created by the previous method, the same // name that the bucket was created with is used again here. .accountTryDepositOrAbort(address, ManifestBuilderBucket("xrdBucket"), null) .build(networkId) } Python from radix_engine_toolkit import * network_id: int = 0x02 xrd: Address = known_addresses(network_id).resource_addresses.xrd address: Address = Address( "account_tdx_2_169ukwfsne0zvwrvsyk3mm3x7m6hggup52t6ng547m9a2qp6q5y99h8" ) manifest: TransactionManifest = ( ManifestBuilder().faucet_free_xrd() # The following is a source method that creates a bucket by taking a specified amount of # resources from the worktop and into a bucket. The name to use for this bucket is `xrd_bucket`. # Any following method that wishes to refer to this bucket must do so using the name # given to it here. .take_from_worktop(xrd, Decimal("10000"), ManifestBuilderBucket("xrd_bucket")) # The following is a sink method that consumes buckets by depositing them into an account. 
When # wishing to refer to the bucket created by the previous method the same name that the bucket # was created with is used again here. .build(network_id) ) Swift import EngineToolkit import Foundation let networkId: UInt8 = 0x02 let xrd = knownAddresses(networkId: networkId).resourceAddresses.xrd let address = try! Address( address: "account_tdx_2_169ukwfsne0zvwrvsyk3mm3x7m6hggup52t6ng547m9a2qp6q5y99h8" ) let manifest = try! ManifestBuilder() .faucetFreeXrd() // The following is a source method that creates a bucket by taking a specified amount of // resources from the worktop and into a bucket. The name to use for this bucket is `xrdBucket`. // Any following method that wishes to refer to this bucket, it must refer to it using the name // given to it here. .takeFromWorktop( resourceAddress: xrd, amount: Decimal(value: "10000"), intoBucket: ManifestBuilderBucket(name: "xrdBucket") ) // The following is a sink method that consumes buckets by depositing them into an account. When // wishing to refer to the bucket created by the previous method the same name that the bucket // was created with is used again here. .accountTryDepositOrAbort( accountAddress: address, bucket: ManifestBuilderBucket(name: "xrdBucket"), authorizedDepositorBadge: nil ) .build(networkId: networkId) Method and Function Calls Due to the large amount of aliases supported by the Radix Engine Toolkit manifest builder, any interactions with native components or packages should not require the use of CallMethod or CallFunction directly; the manifest builder offers high-level methods which it translates to the lower-level CallFunction and CallMethod instructions on behalf of the client. However, outside of native components and packages, there is a need to use CallFunction and CallMethod in manifests. As an example, to call some component and perform a swap. 
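The translation of high-level aliases into generic CallMethod instructions can be sketched in plain Python. This is a hypothetical illustration of the pattern only; the class, method names, and instruction tuples below are invented and do not mirror the Radix Engine Toolkit's internals.

```python
# Hypothetical sketch of a manifest builder whose high-level aliases
# delegate to one generic low-level instruction, CALL_METHOD.
class SketchManifestBuilder:
    def __init__(self):
        self.instructions = []

    def call_method(self, address, method_name, args):
        # The generic, low-level instruction every method call reduces to.
        self.instructions.append(("CALL_METHOD", address, method_name, args))
        return self

    def account_try_deposit_or_abort(self, account, bucket, badge=None):
        # High-level alias: a readable wrapper that delegates to call_method.
        return self.call_method(account, "try_deposit_or_abort", [bucket, badge])


builder = SketchManifestBuilder()
builder.account_try_deposit_or_abort("account_tdx_2_example", "xrdBucket")
assert builder.instructions == [
    ("CALL_METHOD", "account_tdx_2_example", "try_deposit_or_abort", ["xrdBucket", None])
]
```

Because every alias bottoms out in the same generic instruction, a builder can offer ergonomic names for known components while still allowing raw CallMethod for arbitrary ones.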
This section shows how the CallMethod and CallFunction methods on the Radix Engine Toolkit manifest builder may be used in an example manifest. This manifest gets some free XRD from the faucet and deposits them into an account by using CallMethod alone. Note The following manifests can be built without the use of CallMethod at all since these are all methods that have aliases. This example uses CallMethod for them just to show how it can be used. C# using RadixEngineToolkit; using static RadixEngineToolkit.RadixEngineToolkitUniffiMethods; using Decimal = RadixEngineToolkit.Decimal; const byte networkId = 0x02; using var xrd = KnownAddresses(networkId).resourceAddresses.xrd; using var faucet = KnownAddresses(networkId).componentAddresses.faucet; using var account = new Address("account_tdx_2_168e8u653alt59xm8ple6khu6cgce9cfx9mlza6wxf7qs3wwdyqvusn"); using var manifest = new ManifestBuilder() .CallMethod( new ManifestBuilderAddress.Static(faucet), // <1> "free", new ManifestBuilderValue[] { } // <2> ) .TakeFromWorktop(xrd, new Decimal("10000"), new ManifestBuilderBucket("xrd")) .CallMethod( new ManifestBuilderAddress.Static(account), // <1> "try_deposit_or_abort", new ManifestBuilderValue[] // <2> { new ManifestBuilderValue.BucketValue(new ManifestBuilderBucket("xrd")), new ManifestBuilderValue.EnumValue(0, new ManifestBuilderValue[] { }) // <3> }) .Build(networkId); manifest.StaticallyValidate(); Kotlin import com.radixdlt.ret.*; fun main() { val networkId: UByte = 0x02u; val xrd = knownAddresses(networkId).resourceAddresses.xrd; val faucet = knownAddresses(networkId).componentAddresses.faucet; val account = Address("account_tdx_2_168e8u653alt59xm8ple6khu6cgce9cfx9mlza6wxf7qs3wwdyqvusn"); val manifest = ManifestBuilder() .callMethod( ManifestBuilderAddress.Static(faucet), // <1> "free", listOf() // <2> ) .takeFromWorktop(xrd, Decimal("10000"), ManifestBuilderBucket("xrd")) .callMethod( ManifestBuilderAddress.Static(account), // <1> "try_deposit_or_abort", listOf( // 
<2> ManifestBuilderValue.BucketValue(ManifestBuilderBucket("xrd")), ManifestBuilderValue.EnumValue(0u, listOf()) // <3> ) ) .build(networkId); manifest.staticallyValidate(); } Python from radix_engine_toolkit import * network_id: int = 0x02 xrd: Address = known_addresses(network_id).resource_addresses.xrd faucet: Address = known_addresses(network_id).component_addresses.faucet account: Address = Address( "account_tdx_2_168e8u653alt59xm8ple6khu6cgce9cfx9mlza6wxf7qs3wwdyqvusn" ) manifest: TransactionManifest = ( ManifestBuilder() .call_method( ManifestBuilderAddress.STATIC(faucet), # <1> "free", [] # <2> ) .take_from_worktop(xrd, Decimal("10000"), ManifestBuilderBucket("xrd")) .call_method( ManifestBuilderAddress.STATIC(account), # <1> "try_deposit_or_abort", [ # <2> ManifestBuilderValue.BUCKET_VALUE(ManifestBuilderBucket("xrd")), ManifestBuilderValue.ENUM_VALUE(0, []), # <3> ], ) .build(network_id) ) manifest.statically_validate() Swift import EngineToolkit import Foundation let networkId: UInt8 = 0x02 let xrd = knownAddresses(networkId: networkId).resourceAddresses.xrd let faucet = knownAddresses(networkId: networkId).componentAddresses.faucet let account = try! Address(address: "account_tdx_2_168e8u653alt59xm8ple6khu6cgce9cfx9mlza6wxf7qs3wwdyqvusn") let manifest = try! ManifestBuilder() .callMethod( address: ManifestBuilderAddress.static(value: faucet), // <1> methodName: "free", args: [] // <2> ) .takeFromWorktop( resourceAddress: xrd, amount: Decimal(value: "10000"), intoBucket: ManifestBuilderBucket(name: "xrd") ) .callMethod( address: ManifestBuilderAddress.static(value: account), // <1> methodName: "try_deposit_or_abort", args: [ // <2> ManifestBuilderValue.bucketValue(value: ManifestBuilderBucket(name: "xrd")), ManifestBuilderValue.enumValue(discriminator: 0, fields: []), // <3> ]) .build(networkId: networkId) try! manifest.staticallyValidate() There are several things to note about the examples above. 
Each of the following points maps to an area of the above examples. - The first argument to a CallMethod is a ManifestBuilderAddress which is a sum type (https://en.wikipedia.org/wiki/Algebraic_data_type) that can either be Static or Named. - Static addresses are addresses that are known prior to the manifest’s execution. As an example, the address of XRD, the Faucet component, a package that’s already been published, or a component that has already been instantiated in a prior transaction. - Named addresses are addresses that are not known prior to the manifest’s execution. They’re allocated during the manifest’s runtime as a result of AllocateGlobalAddress instructions. - The last argument to a CallMethod instruction is a list or array of ManifestBuilderValue which are the arguments that the method will be called with. In the case of the call to the free method on the faucet, there are no arguments; thus, an empty array or list is provided. On the other hand, the try_deposit_or_abort method on account components expects a bucket and an Option (as seen here (https://github.com/radixdlt/radixdlt-scrypto/blob/d16b3ffa65fea417e9c6d01d76456a0b36c060fa/radix-engine-interface/src/blueprints/account/invocations.rs#L317-L321) ). Thus, the arguments provided through the manifest builder are a ManifestBuilderValue.BucketValue and a ManifestBuilderValue.EnumValue. A None Option can be modeled as a ManifestBuilderValue.EnumValue with discriminator 0 and no fields. - Option, Result, and other enums can be modeled as ManifestBuilderValue.EnumValue. The discriminator is the index of the enum variant. As an example, for Option the discriminator of None is 0 and the discriminator of Some is 1 (as seen here (https://doc.rust-lang.org/src/core/option.rs.html#563-572) ). Similarly, for Result, the discriminator of Ok is 0 and the discriminator of Err is 1.
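The variant-index convention for Option and Result can be sketched in a few lines of plain Python, modeling an enum value as a (discriminator, fields) pair. This is an illustration of the convention only, not toolkit code; the helper names are invented.

```python
# Model an enum value as a (discriminator, fields) pair, where the
# discriminator is the index of the enum variant.

def option_to_enum_value(value):
    """None -> variant 0 with no fields; Some(x) -> variant 1 with one field."""
    if value is None:
        return (0, [])
    return (1, [value])

def result_to_enum_value(ok=None, err=None):
    """Ok -> discriminator 0; Err -> discriminator 1."""
    if err is None:
        return (0, [ok])
    return (1, [err])

assert option_to_enum_value(None) == (0, [])        # the None badge above
assert option_to_enum_value("badge") == (1, ["badge"])
assert result_to_enum_value(ok="value") == (0, ["value"])
assert result_to_enum_value(err="oops") == (1, ["oops"])
```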
- For a comprehensive list of data types compatible with ManifestBuilder, refer to the Data Types (https://github.com/gguuttss/radix-docs/blob/master/build/scrypto-1/data-types.md) documentation. When working in other languages like Python, certain types such as i32 are not directly supported. Instead, you must use the corresponding type from the Radix Engine Toolkit library. For example, when using .CALL_METHOD with an i32 argument (e.g., -6927i32), you should use ManifestBuilderValue.I32_VALUE(-6927) instead. An example for Python is provided in the next section. Usage Examples Account to Account Transfer C# using RadixEngineToolkit; using static RadixEngineToolkit.RadixEngineToolkitUniffiMethods; using Decimal = RadixEngineToolkit.Decimal; const byte networkId = 0x02; var xrd = KnownAddresses(networkId).resourceAddresses.xrd; using var address1 = new Address("account_tdx_2_168e8u653alt59xm8ple6khu6cgce9cfx9mlza6wxf7qs3wwdyqvusn"); using var address2 = new Address("account_tdx_2_169ukwfsne0zvwrvsyk3mm3x7m6hggup52t6ng547m9a2qp6q5y99h8"); using var manifest = new ManifestBuilder() .AccountLockFeeAndWithdraw(address1, new Decimal("10"), xrd, new Decimal("1000")) .TakeFromWorktop(xrd, new Decimal("1000"), new ManifestBuilderBucket("xrdBucket")) .AccountTryDepositOrAbort(address2, new ManifestBuilderBucket("xrdBucket"), null) .Build(networkId); Kotlin import com.radixdlt.ret.* fun main(args: Array) { val networkId: UByte = 0x02u val xrd = knownAddresses(networkId).resourceAddresses.xrd val address1 = Address("account_tdx_2_168e8u653alt59xm8ple6khu6cgce9cfx9mlza6wxf7qs3wwdyqvusn") val address2 = Address("account_tdx_2_169ukwfsne0zvwrvsyk3mm3x7m6hggup52t6ng547m9a2qp6q5y99h8") val manifest = ManifestBuilder() .accountLockFeeAndWithdraw(address1, Decimal("10"), xrd, Decimal("1000")) .takeFromWorktop(xrd, Decimal("1000"), ManifestBuilderBucket("xrdBucket")) .accountTryDepositOrAbort(address2, ManifestBuilderBucket("xrdBucket"), null) .build(networkId) } Python from 
radix_engine_toolkit import * network_id: int = 0x02 xrd: Address = known_addresses(network_id).resource_addresses.xrd address1: Address = Address( "account_tdx_2_168e8u653alt59xm8ple6khu6cgce9cfx9mlza6wxf7qs3wwdyqvusn" ) address2: Address = Address( "account_tdx_2_169ukwfsne0zvwrvsyk3mm3x7m6hggup52t6ng547m9a2qp6q5y99h8" ) component_address: Address = Address( "component_tdx_2_1234kwfsne0zvwrvsyk3mm3x7m6hggup52t6ng547m9a2qp6q5yabcd" ) manifest: TransactionManifest = ( ManifestBuilder() .account_lock_fee_and_withdraw(address1, Decimal("10"), xrd, Decimal("1000")) .take_from_worktop(xrd, Decimal("1000"), ManifestBuilderBucket("xrdBucket")) .account_try_deposit_or_abort(address2, ManifestBuilderBucket("xrdBucket"), None) .build(network_id) ) manifest2: TransactionManifest = ( ManifestBuilder() .account_lock_fee_and_withdraw(address1, Decimal("10"), xrd, Decimal("1000")) .take_from_worktop(xrd, Decimal("1000"), ManifestBuilderBucket("xrdBucket")) .call_method( ManifestBuilderAddress.STATIC(component_address), "function_name",[ ManifestBuilderValue.I32_VALUE(-6927), ManifestBuilderValue.BUCKET_VALUE(ManifestBuilderBucket("xrdBucket")) ]) .account_deposit_entire_worktop(address1) .build(network_id) ) Swift import EngineToolkit import Foundation let networkId: UInt8 = 0x02 let xrd = knownAddresses(networkId: networkId).resourceAddresses.xrd let address1 = try! Address( address: "account_tdx_2_168e8u653alt59xm8ple6khu6cgce9cfx9mlza6wxf7qs3wwdyqvusn" ) let address2 = try! Address( address: "account_tdx_2_169ukwfsne0zvwrvsyk3mm3x7m6hggup52t6ng547m9a2qp6q5y99h8" ) let manifest = try!
ManifestBuilder() .accountLockFeeAndWithdraw( accountAddress: address1, amountToLock: Decimal(value: "10"), resourceAddress: xrd, amount: Decimal(value: "1000") ) .takeFromWorktop( resourceAddress: xrd, amount: Decimal(value: "1000"), intoBucket: ManifestBuilderBucket(name: "xrdBucket") ) .accountTryDepositOrAbort( accountAddress: address2, bucket: ManifestBuilderBucket(name: "xrdBucket"), authorizedDepositorBadge: nil ) .build(networkId: networkId) ## Derivation URL: https://radix.wiki/developers/legacy-docs/integrate/radix-engine-toolkit/radix-engine-toolkit-usage-guide/radix-engine-toolkit-derivation Updated: 2026-02-18 Summary: This article is a usage guide for the derivation module of the Radix Engine Toolkit which contains a set of functions for deriving data from some other data. Integrate > Radix Engine Toolkit > Usage Guide > Derivation — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/radix-engine-toolkit/radix-engine-toolkit-usage-guide/radix-engine-toolkit-derivation.md) This article is a usage guide for the derivation module of the Radix Engine Toolkit which contains a set of functions for deriving data from some other data. For example, this module has the logic required to derive virtual account addresses from public keys. Virtual Account Address From a Public Key Deterministically maps a public key to its associated virtual account component address. An account can be virtual or physical (see the " Account Virtualization (../../../reference/radix-engine/native-blueprints/account.md#account-virtualization) " article). A virtual account is an account with no physical state on the ledger, just virtual state, and whose component address is derived from the public key associated with the private key controlling the account. A virtual account only becomes physical after the first transaction that interacts with the account.
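The key property here is determinism: the address is a pure function of the public key, so anyone can compute it before the account ever appears on ledger. The toy sketch below illustrates only that property; the hash construction, prefix, and encoding are invented for illustration and are not the toolkit's actual derivation (which involves entity-type bytes and Bech32m encoding).

```python
# Illustrative only: a deterministic mapping from public key to a
# made-up address string. Not the real Radix derivation.
import hashlib

def sketch_virtual_account_address(public_key: bytes, network_id: int) -> str:
    # Hash the public key; the address is fully determined by its bytes.
    digest = hashlib.blake2b(public_key, digest_size=32).hexdigest()
    return f"account_sketch_{network_id:02x}_{digest[:26]}"

pk = bytes.fromhex(
    "0358156e6ea3ab47072823a722932b6a22d5f84263a6a652ed452f416b90197fcf"
)
a1 = sketch_virtual_account_address(pk, 0x01)
a2 = sketch_virtual_account_address(pk, 0x01)
assert a1 == a2  # same key always yields the same address
assert sketch_virtual_account_address(b"\x02" + b"\x00" * 32, 0x01) != a1
```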
Function Name Language Function Name C# Address.VirtualAccountAddressFromPublicKey Kotlin Address.virtualAccountAddressFromPublicKey Python Address.virtual_account_address_from_public_key Swift Address.virtualAccountAddressFromPublicKey Arguments - PublicKey - The public key to map to a virtual account component address. This can be either an Ed25519 or a Secp256k1 public key. - u8 - The ID of the network that the address will be used for. This is used for the Bech32m encoding of the address. Returns - Address - The derived virtual account component address. Exceptions Exceptions are thrown in the following cases: - If the public key is invalid for its curve. As an example, a 33-byte long Ed25519 public key. - If the network ID is not between 0x00 and 0xFF. This is only the case in languages that do not have an 8-bit unsigned integer type and thus such things cannot be enforced at compile time. Examples C# using RadixEngineToolkit; const byte networkId = 0x01; var publicKeyBytes = Convert.FromHexString( "0358156e6ea3ab47072823a722932b6a22d5f84263a6a652ed452f416b90197fcf" ); var publicKey = new PublicKey.Secp256k1( publicKeyBytes ); var virtualAccountAddress = Address.VirtualAccountAddressFromPublicKey( publicKey, networkId ); Console.WriteLine(virtualAccountAddress.AsStr()); Kotlin import com.radixdlt.ret.* @OptIn(ExperimentalStdlibApi::class, ExperimentalUnsignedTypes::class) fun main(args: Array<String>) { val networkId: UByte = 0x01u val publicKeyBytes = "0358156e6ea3ab47072823a722932b6a22d5f84263a6a652ed452f416b90197fcf".hexToUByteArray() val publicKey = PublicKey.Secp256k1(publicKeyBytes.toList()) val virtualAccountAddress = Address.virtualAccountAddressFromPublicKey(publicKey, networkId); println(virtualAccountAddress.asStr()) } Python from radix_engine_toolkit import * network_id: int = 0x01 public_key_bytes: bytearray = bytearray.fromhex( "0358156e6ea3ab47072823a722932b6a22d5f84263a6a652ed452f416b90197fcf" ) public_key: PublicKey =
PublicKey.SECP256K1(public_key_bytes) virtual_account_address: Address = Address.virtual_account_address_from_public_key( public_key, network_id ) print(virtual_account_address.as_str()) Swift import EngineToolkit import Foundation let networkId: UInt8 = 0x01 let publicKeyBytes = "0358156e6ea3ab47072823a722932b6a22d5f84263a6a652ed452f416b90197fcf" .hexToData()!; let publicKey = PublicKey.secp256k1(value: Array(publicKeyBytes)) let virtualAccountAddress = try! Address.virtualAccountAddressFromPublicKey( publicKey: publicKey, networkId: networkId ) print(virtualAccountAddress.asStr()) Virtual Identity Address From a Public Key Deterministically maps a public key to its associated virtual identity component address. Function Name Language Function Name C# Address.VirtualIdentityAddressFromPublicKey Kotlin Address.virtualIdentityAddressFromPublicKey Python Address.virtual_identity_address_from_public_key Swift Address.virtualIdentityAddressFromPublicKey Arguments - PublicKey - The public key to map to a virtual identity component address. This can be either an Ed25519 or a Secp256k1 public key. - u8 - The ID of the network that the address will be used for. This is used for the Bech32m encoding of the address. Returns - Address - The derived virtual identity component address. Exceptions Exceptions are thrown in the following cases: - If the public key is invalid for its curve. As an example, a 33-byte long Ed25519 public key. - If the network ID is not between 0x00 and 0xFF. This is only the case in languages that do not have an 8-bit unsigned integer type and thus such things cannot be enforced at compile time.
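The network-ID exception only arises in languages without an 8-bit unsigned integer type, where the wrapper has to check the range at runtime. A minimal sketch of such a guard (a hypothetical helper, not toolkit code):

```python
# Hypothetical runtime guard for languages without a u8 type: reject any
# network ID that does not fit in an unsigned 8-bit integer.
def check_network_id(network_id: int) -> int:
    if not 0x00 <= network_id <= 0xFF:
        raise ValueError(f"network ID {network_id} does not fit in a u8")
    return network_id

assert check_network_id(0x01) == 0x01
try:
    check_network_id(256)
except ValueError:
    pass  # out-of-range IDs are rejected at runtime
```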
Examples C# using RadixEngineToolkit; const byte networkId = 0x01; var publicKeyBytes = Convert.FromHexString( "0358156e6ea3ab47072823a722932b6a22d5f84263a6a652ed452f416b90197fcf" ); var publicKey = new PublicKey.Secp256k1( publicKeyBytes ); var virtualIdentityAddress = Address.VirtualIdentityAddressFromPublicKey( publicKey, networkId ); Console.WriteLine(virtualIdentityAddress.AsStr()); Kotlin import com.radixdlt.ret.* @OptIn(ExperimentalStdlibApi::class, ExperimentalUnsignedTypes::class) fun main(args: Array) { val networkId: UByte = 0x01u val publicKeyBytes = "0358156e6ea3ab47072823a722932b6a22d5f84263a6a652ed452f416b90197fcf".hexToUByteArray() val publicKey = PublicKey.Secp256k1(publicKeyBytes.toList()) val virtualIdentityAddress = Address.virtualIdentityAddressFromPublicKey(publicKey, networkId); println(virtualIdentityAddress.asStr()) } Python from radix_engine_toolkit import * network_id: int = 0x01 public_key_bytes: bytearray = bytearray.fromhex( "0358156e6ea3ab47072823a722932b6a22d5f84263a6a652ed452f416b90197fcf" ) public_key: PublicKey = PublicKey.SECP256K1(public_key_bytes) virtual_identity_address: Address = Address.virtual_identity_address_from_public_key( public_key, network_id ) print(virtual_identity_address.as_str()) Swift import EngineToolkit import Foundation let networkId: UInt8 = 0x01 let publicKeyBytes = "0358156e6ea3ab47072823a722932b6a22d5f84263a6a652ed452f416b90197fcf" .hexToData()!; let publicKey = PublicKey.secp256k1(value: Array(publicKeyBytes)) let virtualIdentityAddress = try! Address.virtualIdentityAddressFromPublicKey( publicKey: publicKey, networkId: networkId ) print(virtualIdentityAddress.asStr()) Virtual Account Address From an Olympia Account Address Deterministically maps an Olympia account address to a Babylon virtual account component address. 
Function Name Language Function Name C# Address.VirtualAccountAddressFromOlympiaAddress Kotlin Address.virtualAccountAddressFromOlympiaAddress Python Address.virtual_account_address_from_olympia_address Swift Address.virtualAccountAddressFromOlympiaAddress Arguments - OlympiaAddress - The Olympia account address to map to a Babylon virtual account component address. - u8 - The ID of the network that the address will be used for. This is used for the Bech32m encoding of the address. Returns - Address - The derived virtual account component address. Exceptions Exceptions are thrown in the following cases: - If the Olympia account address is invalid. - If the network ID is not between 0x00 and 0xFF. This is only the case in languages that do not have an 8-bit unsigned integer type and thus such things cannot be enforced at compile time. Examples C# using RadixEngineToolkit; const byte networkId = 0x01; var olympiaAccountAddress = new OlympiaAddress( "rdx1qsp4s9twd636k3c89q36wg5n9d4z940cgf36dfjja4zj7sttjqvhlncsmfwxe" ); var virtualAccountAddress = Address.VirtualAccountAddressFromOlympiaAddress( olympiaAccountAddress, networkId ); Console.WriteLine(virtualAccountAddress.AsStr()); Kotlin import com.radixdlt.ret.* fun main(args: Array<String>) { val networkId: UByte = 0x01u val olympiaAccountAddress = OlympiaAddress("rdx1qsp4s9twd636k3c89q36wg5n9d4z940cgf36dfjja4zj7sttjqvhlncsmfwxe"); val virtualAccountAddress = Address.virtualAccountAddressFromOlympiaAddress(olympiaAccountAddress, networkId); println(virtualAccountAddress.asStr()) } Python from radix_engine_toolkit import * network_id: int = 0x01 olympia_account_address: OlympiaAddress = OlympiaAddress( "rdx1qsp4s9twd636k3c89q36wg5n9d4z940cgf36dfjja4zj7sttjqvhlncsmfwxe" ) virtual_account_address: Address = Address.virtual_account_address_from_olympia_address( olympia_account_address, network_id ) print(virtual_account_address.as_str()) Swift import EngineToolkit import Foundation let networkId: UInt8 = 0x01 let
olympiaAccountAddress = OlympiaAddress( address: "rdx1qsp4s9twd636k3c89q36wg5n9d4z940cgf36dfjja4zj7sttjqvhlncsmfwxe" ); let virtualAccountAddress = try! Address.virtualAccountAddressFromOlympiaAddress( olympiaAccountAddress: olympiaAccountAddress, networkId: networkId ); print(virtualAccountAddress.asStr()) Resource Address from an Olympia Resource Identifier Deterministically maps an Olympia resource address to a Babylon resource address. Function Name Language Function Name C# Address.ResourceAddressFromOlympiaResourceAddress Kotlin Address.resourceAddressFromOlympiaResourceAddress Python Address.resource_address_from_olympia_resource_address Swift Address.resourceAddressFromOlympiaResourceAddress Arguments - OlympiaAddress - The Olympia resource address to map to a Babylon resource address. - u8 - The ID of the network that the address will be used for. This is used for the Bech32m encoding of the address. Returns - Address - The derived resource address. Exceptions Exceptions are thrown in the following cases: - If the Olympia resource address is invalid. - If the network ID is not between 0x00 and 0xFF. This is only the case in languages that do not have an 8-bit unsigned integer type and thus such things cannot be enforced at compile time.
Examples C# using RadixEngineToolkit; const byte networkId = 0x01; var olympiaResourceAddress = new OlympiaAddress( "xrd_rr1qy5wfsfh" ); var resourceAddress = Address.ResourceAddressFromOlympiaResourceAddress( olympiaResourceAddress, networkId ); Console.WriteLine(resourceAddress.AsStr()); Kotlin import com.radixdlt.ret.* fun main(args: Array<String>) { val networkId: UByte = 0x01u val olympiaResourceAddress = OlympiaAddress("xrd_rr1qy5wfsfh"); val resourceAddress = Address.resourceAddressFromOlympiaResourceAddress(olympiaResourceAddress, networkId); println(resourceAddress.asStr()) } Python from radix_engine_toolkit import * network_id: int = 0x01 olympia_resource_address: OlympiaAddress = OlympiaAddress("xrd_rr1qy5wfsfh") resource_address: Address = Address.resource_address_from_olympia_resource_address( olympia_resource_address, network_id ) print(resource_address.as_str()) Swift import EngineToolkit import Foundation let networkId: UInt8 = 0x01 let olympiaResourceAddress = OlympiaAddress( address: "xrd_rr1qy5wfsfh" ); let resourceAddress = try! Address.resourceAddressFromOlympiaResourceAddress( olympiaResourceAddress: olympiaResourceAddress, networkId: networkId ); print(resourceAddress.asStr()) Public Key from an Olympia Account Address Derives the public key associated with an Olympia account address. Function Name Language Function Name C# DerivePublicKeyFromOlympiaAccountAddress Kotlin derivePublicKeyFromOlympiaAccountAddress Python derive_public_key_from_olympia_account_address Swift derivePublicKeyFromOlympiaAccountAddress Arguments - OlympiaAddress - The Olympia address to derive the public key from. Returns - PublicKey - The derived public key. This will always be a Secp256k1 public key as this is the only curve that was supported in Olympia. Exceptions Exceptions are thrown in the following cases: - If the Olympia account address is invalid.
Examples C# using RadixEngineToolkit; using static RadixEngineToolkit.RadixEngineToolkitUniffiMethods; var olympiaAccountAddress = new OlympiaAddress( "rdx1qsp4s9twd636k3c89q36wg5n9d4z940cgf36dfjja4zj7sttjqvhlncsmfwxe" ); var publicKey = DerivePublicKeyFromOlympiaAccountAddress( olympiaAccountAddress ); Kotlin import com.radixdlt.ret.* fun main(args: Array) { val olympiaAccountAddress = OlympiaAddress("rdx1qsp4s9twd636k3c89q36wg5n9d4z940cgf36dfjja4zj7sttjqvhlncsmfwxe"); val publicKey = derivePublicKeyFromOlympiaAccountAddress(olympiaAccountAddress) } Python from radix_engine_toolkit import * olympia_account_address: OlympiaAddress = OlympiaAddress( "rdx1qsp4s9twd636k3c89q36wg5n9d4z940cgf36dfjja4zj7sttjqvhlncsmfwxe" ) public_key: PublicKey = derive_public_key_from_olympia_account_address( olympia_account_address ) Swift import EngineToolkit import Foundation let olympiaAccountAddress = OlympiaAddress( address: "rdx1qsp4s9twd636k3c89q36wg5n9d4z940cgf36dfjja4zj7sttjqvhlncsmfwxe" ); let publicKey = try! derivePublicKeyFromOlympiaAccountAddress( olympiaResourceAddress: olympiaAccountAddress ) Olympia Account Address from a Public Key Derives an Olympia account address from a Secp256k1 public key. Function Name Language Function Name C# Address.DeriveOlympiaAccountAddressFromPublicKey Kotlin Address.DeriveOlympiaAccountAddressFromPublicKey Python Address.derive_olympia_account_address_from_public_key Swift Address.DeriveOlympiaAccountAddressFromPublicKey Arguments - PublicKey - The public key to map to an Olympia account address. - OlympiaNetwork - The network that the Olympia account address is to be used for. This will be used for the Bech32 encoding of the Olympia address. Returns - OlympiaAddress - The derived Olympia account address. Exceptions Exceptions are thrown in the following cases: - If the public key is not a valid Secp256k1 public key. 
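"Invalid for its curve" in the exception lists above includes simple shape errors. A compressed Secp256k1 public key is 33 bytes beginning with 0x02 or 0x03, while an Ed25519 public key is 32 bytes; a 33-byte Ed25519 key is therefore rejected on length alone. A minimal shape check (illustrative helper only; real validation also verifies the point actually lies on the curve):

```python
# Shape-level validation of a compressed Secp256k1 public key:
# exactly 33 bytes, first byte 0x02 or 0x03 (the parity prefix).
def looks_like_compressed_secp256k1(key: bytes) -> bool:
    return len(key) == 33 and key[0] in (0x02, 0x03)

good = bytes.fromhex(
    "0358156e6ea3ab47072823a722932b6a22d5f84263a6a652ed452f416b90197fcf"
)
assert looks_like_compressed_secp256k1(good)
assert not looks_like_compressed_secp256k1(b"\x00" * 32)  # Ed25519-sized, wrong prefix
```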
Examples C# using RadixEngineToolkit; using static RadixEngineToolkit.RadixEngineToolkitUniffiMethods; var publicKeyBytes = Convert.FromHexString( "0358156e6ea3ab47072823a722932b6a22d5f84263a6a652ed452f416b90197fcf" ); var publicKey = new PublicKey.Secp256k1( publicKeyBytes ); var olympiaAccountAddress = DeriveOlympiaAccountAddressFromPublicKey( publicKey, OlympiaNetwork.MAINNET ); Console.WriteLine( olympiaAccountAddress.AsStr() ); Kotlin import com.radixdlt.ret.* @OptIn(ExperimentalStdlibApi::class, ExperimentalUnsignedTypes::class) fun main(args: Array) { val publicKeyBytes = "0358156e6ea3ab47072823a722932b6a22d5f84263a6a652ed452f416b90197fcf".hexToUByteArray() val publicKey = PublicKey.Secp256k1(publicKeyBytes.toList()) val olympiaAccountAddress = deriveOlympiaAccountAddressFromPublicKey(publicKey, OlympiaNetwork.MAINNET) println(olympiaAccountAddress.asStr()) } Python from radix_engine_toolkit import * public_key_bytes: bytearray = bytearray.fromhex( "0358156e6ea3ab47072823a722932b6a22d5f84263a6a652ed452f416b90197fcf" ) public_key: PublicKey = PublicKey.SECP256K1(public_key_bytes) olympia_account_address: OlympiaAddress = ( derive_olympia_account_address_from_public_key(public_key, OlympiaNetwork.MAINNET) ) print(olympia_account_address.as_str()) Swift import EngineToolkit import Foundation let publicKeyBytes = "0358156e6ea3ab47072823a722932b6a22d5f84263a6a652ed452f416b90197fcf" .hexToData()!; let publicKey = PublicKey.secp256k1(value: Array(publicKeyBytes)) let olympiaAccountAddress = try! deriveOlympiaAccountAddressFromPublicKey(publicKey: publicKey, olympiaNetwork: OlympiaNetwork.mainnet) print(olympiaAccountAddress.asStr()) Virtual Signature Badge Non-fungible Global ID from Public Key Derives the non-fungible global ID of the virtual signature badge associated with a given public key. 
Function Name Language Function Name C# NonFungibleGlobalId.VirtualSignatureBadge Kotlin NonFungibleGlobalId.virtualSignatureBadge Python NonFungibleGlobalId.virtual_signature_badge Swift NonFungibleGlobalId.virtualSignatureBadge Arguments - PublicKey - The public key to calculate the non-fungible global ID of the virtual signature badge for. - u8 - The ID of the network that the address will be used for. This is used for the Bech32m encoding of the address. Returns - NonFungibleGlobalId - The non-fungible global ID of the virtual signature badge of the passed public key. Exceptions Exceptions are thrown in the following cases: - If the public key is invalid for its curve. As an example, a 33-byte long Ed25519 public key. - If the network ID is not between 0x00 and 0xFF. This is only the case in languages that do not have an 8-bit unsigned integer type and thus such things cannot be enforced at compile time. Examples C# using RadixEngineToolkit; const byte networkId = 0x01; var publicKeyBytes = Convert.FromHexString( "0358156e6ea3ab47072823a722932b6a22d5f84263a6a652ed452f416b90197fcf" ); var publicKey = new PublicKey.Secp256k1( publicKeyBytes ); var virtualSignatureNonFungibleGlobalId = NonFungibleGlobalId.VirtualSignatureBadge( publicKey, networkId ); Console.WriteLine( virtualSignatureNonFungibleGlobalId.AsStr() ); Kotlin import com.radixdlt.ret.* @OptIn(ExperimentalStdlibApi::class, ExperimentalUnsignedTypes::class) fun main(args: Array<String>) { val networkId: UByte = 0x01u val publicKeyBytes = "0358156e6ea3ab47072823a722932b6a22d5f84263a6a652ed452f416b90197fcf".hexToUByteArray() val publicKey = PublicKey.Secp256k1(publicKeyBytes.toList()) val virtualSignatureNonFungibleGlobalId = NonFungibleGlobalId.virtualSignatureBadge(publicKey, networkId); println(virtualSignatureNonFungibleGlobalId.asStr()) } Python from radix_engine_toolkit import * network_id: int = 0x01 public_key_bytes: bytearray = bytearray.fromhex( "0358156e6ea3ab47072823a722932b6a22d5f84263a6a652ed452f416b90197fcf" ) public_key: PublicKey =
PublicKey.SECP256K1(public_key_bytes) virtual_signature_non_fungible_global_id: NonFungibleGlobalId = ( NonFungibleGlobalId.virtual_signature_badge(public_key, network_id) ) print(virtual_signature_non_fungible_global_id.as_str()) Swift import EngineToolkit import Foundation let networkId: UInt8 = 0x01 let publicKeyBytes = "0358156e6ea3ab47072823a722932b6a22d5f84263a6a652ed452f416b90197fcf" .hexToData()!; let publicKey = PublicKey.secp256k1(value: Array(publicKeyBytes)) let virtualSignatureNonFungibleGlobalId = try! NonFungibleGlobalId.virtualSignatureBadge(publicKey: publicKey, networkId: networkId) print(virtualSignatureNonFungibleGlobalId.asStr()) ## Usage Guide URL: https://radix.wiki/developers/legacy-docs/integrate/radix-engine-toolkit/radix-engine-toolkit-usage-guide/radix-engine-toolkit-usage-guide Updated: 2026-02-18 Summary: This section is a usage guide of the various Radix Engine Toolkit wrappers alongside example usages. Integrate > Radix Engine Toolkit > Usage Guide — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/radix-engine-toolkit/radix-engine-toolkit-usage-guide/README.md) This section is a usage guide of the various Radix Engine Toolkit wrappers alongside example usages. Notes - All of the Swift examples provided in this section depend on a utility file that contains the following code: import Foundation extension String { func hexToData() -> Data? { var data = Data() var startIndex = self.startIndex while startIndex < self.endIndex { let endIndex = self.index(startIndex, offsetBy: 2) if let byte = UInt8(self[startIndex.. Radix Engine Toolkit > Getting Started — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/radix-engine-toolkit/radix-engine-toolkit-installation.md) Choosing a Toolkit As mentioned in the previous pages, the Radix Engine Toolkit has wrappers in 5 main languages: C#, Kotlin, Python, Swift, and TypeScript. 
Rust, being the language of Scrypto and the Radix Engine, has the best support. If you decide to use the Radix Engine Toolkit, there are certain tradeoffs between the various wrappers that you must consider. The TypeScript Radix Engine Toolkit is different from the C#, Kotlin, Python, and Swift ones in that it does not use UniFFI and is a manually written wrapper. This means that it is simpler and its interface does not change very often, but it is less capable than the other toolkit wrappers. It is intended to provide frontend clients and exchange integrators with the tooling needed to construct manifests and sign transactions either programmatically or for the purpose of integrating with the Radix dApp Toolkit and the functionality surrounding it. It is typically the toolkit that you want to use in your frontend, but in your backend you may want a more powerful wrapper. The C#, Kotlin, Python, and Swift wrappers are all UniFFI wrappers, which means that they are automatically generated bindings from the Rust library that powers them. These wrappers have much more powerful manifest and transaction builders, support for typed events, the SBOR codec, derivations, and more features that align them more closely with the toolkit’s vision of “providing non-Rust programming languages with the Scrypto and Radix Engine primitives present in Rust”. They are an ideal choice for use in backends and in cases where the features provided by these bindings are needed. Finally, Rust offers the best support for everything users might want to do; as the language of Scrypto and the Radix Engine itself, it should be the first language to consider for any application that interacts with the Radix ledger.
There does not exist a Radix Engine Toolkit in Rust, as one is not needed: all of the functionality present in the toolkit is available in the transactions and radix-engine-interface crates. It is recommended that users prefer Rust for their needs, then the UniFFI wrappers (C#, Kotlin, Python, and Swift), then the TypeScript toolkit, in that order. Installation This document covers the installation of the various Radix Engine Toolkit wrappers through the appropriate package managers. C# (NuGet) The C# Radix Engine Toolkit is published to NuGet under the name RadixDlt.RadixEngineToolkit (https://www.nuget.org/packages/RadixDlt.RadixEngineToolkit) . If you have the dotnet CLI tool installed then you can install the C# toolkit wrapper through: dotnet add package RadixDlt.RadixEngineToolkit It may also be installed through the NuGet GUI offered by various IDEs such as Visual Studio and JetBrains Rider. Kotlin (Maven or Gradle) The Kotlin Radix Engine Toolkit is not yet published to Maven Central. Instead, the relevant JAR can be downloaded from https://github.com/radixdlt/radix-engine-toolkit/packages/1895806/versions To make updates easier, you may want to add GitHub as an Apache Maven repository (https://docs.github.com/en/packages/working-with-a-github-packages-registry/working-with-the-apache-maven-registry) or a Gradle repository (https://docs.github.com/en/packages/working-with-a-github-packages-registry/working-with-the-gradle-registry) . Python (PyPI) The Python Radix Engine Toolkit is published to PyPI under the name radix-engine-toolkit (https://pypi.org/project/radix-engine-toolkit/) . It can be installed with: pip install radix-engine-toolkit Depending on your operating system and environment you may need to use pip3, python -m pip, or python3 -m pip.
Swift (SPM) Adding the Swift Radix Engine Toolkit as a dependency cannot be done through the CLI, only through manual edits of the Package.swift file. The following is an example of the modifications to apply to it:

// swift-tools-version: 5.9
import PackageDescription

let package = Package(
    name: "example",
    platforms: [
        .macOS(.v12), // <1>
        .iOS(.v11), // <1>
    ],
    dependencies: [
        .package(url: "https://github.com/radixdlt/swift-engine-toolkit", exact: "1.0.0") // <2>
    ],
    targets: [
        .executableTarget(
            name: "example",
            dependencies: [
                .product(name: "EngineToolkit", package: "swift-engine-toolkit") // <3>
            ]
        ),
    ]
)

The three items below need to be added to the Package.swift file: - The minimum macOS and iOS versions need to be set to v12 and v11 respectively, or higher. This is due to the minimum version requirements of the Radix Engine Toolkit. - The Swift Radix Engine Toolkit needs to be added as a dependency and the version needs to be specified. - Each of the targets that depend on the Swift Radix Engine Toolkit should have the EngineToolkit product added to it. Go (pkg.go.dev) The Go Radix Engine Toolkit is published as github.com/radixdlt/radix-engine-toolkit-go/v2 (https://pkg.go.dev/github.com/radixdlt/radix-engine-toolkit-go/v2) . Installation instructions are available here (https://pkg.go.dev/github.com/radixdlt/radix-engine-toolkit-go/v2#section-readme) . Hello World The "Hello World" equivalent of the Radix Engine Toolkit, used to verify that it has been installed successfully and works in the current environment, is the build_information function, which prints information about the build of the Radix Engine Toolkit Core in use, such as its version and the version of Scrypto it depends on.
C#

using RadixEngineToolkit;

Console.WriteLine(RadixEngineToolkitUniffiMethods.BuildInformation());

Kotlin

import com.radixdlt.ret.*

fun main(args: Array<String>) {
    println(buildInformation())
}

Python

from radix_engine_toolkit import *

print(build_information())

Swift

import EngineToolkit

print(buildInformation())

## Architecture URL: https://radix.wiki/developers/legacy-docs/integrate/radix-engine-toolkit/radix-engine-toolkit-architecture Updated: 2026-02-18 Summary: The Radix Engine Toolkit is written in Rust and is split between three main crates: Integrate > Radix Engine Toolkit > Architecture — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/radix-engine-toolkit/radix-engine-toolkit-architecture.md) The Radix Engine Toolkit is written in Rust and is split between three main crates: - Radix Engine Toolkit Core: Includes the implementation of the core functionality of the Radix Engine Toolkit in a functional way. This crate makes no assumptions about how it will be exposed or interfaced with. None of the models in this crate are either Serde or UniFFI serializable. This helps establish a clear separation of concerns: the core of the toolkit implements the functionality alone without any assumptions about the interface, and the interface crates implement the interface without any knowledge of the implementation of the underlying functionality. This allows the Radix Engine Toolkit to have extensible interfaces. As an example, the Radix Engine Toolkit could be extended to be exposed over a REST API or JSON RPC by adding an interface crate. - Radix Engine Toolkit Interfaces - Radix Engine Toolkit UniFFI: Exposes the Radix Engine Toolkit through a UniFFI (https://mozilla.github.io/uniffi-rs/) interface where each of the exposed functions maps directly to one or more functions from the Radix Engine Toolkit Core. This is the interface that the Swift, Kotlin, C#, and Python Radix Engine Toolkit wrappers are based on.
UniFFI allows for bindings to be generated from the Rust code, which has allowed the Radix Engine Toolkit to support a variety of different programming languages in a fairly short time. - Radix Engine Toolkit JSON: Exposes the Radix Engine Toolkit through a C-ABI Foreign Function Interface (FFI) where all of the inputs and outputs of functions are pointers to JSON strings. The TypeScript Radix Engine Toolkit wraps a WASM module of this toolkit interface. This interface offers a simple API that is WASM-compatible and thus suitable for the web and other platforms where a WASM runtime is available. The following diagram visualizes the different layers that exist in the Radix Engine Toolkit stack: The Swift, Kotlin, C#, and Python implementations of the Radix Engine Toolkit are referred to as toolkit wrappers, as they wrap a Radix Engine Toolkit dynamic library, make calls to it, and return results back to the caller. From this point onward, Radix Engine Toolkit implementations in non-Rust languages will be referred to as toolkit wrappers or Radix Engine Toolkit wrappers. The UniFFI toolkit wrappers such as the ones written in Kotlin, C#, and Python do not have either public or private GitHub repositories. This is because at Radix Engine Toolkit build time in the GitHub CI, UniFFI is used to generate the bindings (i.e., the .kt, .cs, and .py files required to interface with the dynamic libraries) and then CI packages those bindings up and publishes them to the appropriate package managers such as Maven Central, NuGet, and PyPI. Therefore, the Kotlin, C#, and Python Radix Engine Toolkits do not require traditional GitHub repositories as they're fully automatically generated and have no need for things such as commit history, issues, pull requests, and so on. Any issues or pull requests should be opened against the main Radix Engine Toolkit repository here (https://github.com/radixdlt/radix-engine-toolkit/) .
Out of the four UniFFI toolkit wrappers only the Swift Radix Engine Toolkit has a GitHub repository (https://github.com/radixdlt/swift-engine-toolkit) as it is essential for the Swift Package Manager (SPM). However, no manual commits are made to that repository; just commits from the GitHub CI runner during the Radix Engine Toolkit builds. Compatibility As described above, all of the toolkit wrappers contain dynamic libraries. These libraries are platform-specific and thus do not run everywhere, though they cover most common environments. The following table shows the platforms supported by the various toolkit wrappers:

| LLVM Target Triple | Swift | Kotlin (Android) | Kotlin | C# | Python |
| --- | --- | --- | --- | --- | --- |
| x86_64-pc-windows-gnu | | | ✅ | ✅ | ✅ |
| aarch64-pc-windows-msvc | | | | | |
| x86_64-unknown-linux-gnu | | | ✅ | ✅ | ✅ |
| aarch64-unknown-linux-gnu | | | ✅ | ✅ | ✅ |
| aarch64-apple-darwin | ✅ | | ✅ | ✅ | ✅ |
| x86_64-apple-darwin | ✅ | | ✅ | ✅ | ✅ |
| x86_64-apple-ios | ✅ | | | | |
| aarch64-apple-ios | ✅ | | | | |
| aarch64-apple-ios-sim | ✅ | | | | |
| aarch64-linux-android | | ✅ | | | |
| armv7-linux-androideabi | | ✅ | | | |

This means that the different toolkit wrappers should run on ARM and x86_64 processors and on Windows, Linux, and macOS operating systems with no issues. The only exception is Windows running on ARM processors, which is quite uncommon. If you encounter problems relating to supported targets then feel free to open an issue in the Radix Engine Toolkit repository (https://github.com/radixdlt/radix-engine-toolkit/) and the team will look into adding additional target support to the Radix Engine Toolkit.
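To check your own environment against the table above, a rough stdlib-only Python helper can guess the local machine's target-triple family. This is purely illustrative (not part of the Radix Engine Toolkit), and the mapping below is a simplification of real LLVM target triples:

```python
import platform

def local_target_hint() -> str:
    """Rough guess at the LLVM target-triple family of the local machine,
    for comparing against the toolkit compatibility table. Illustrative
    helper only; not part of the Radix Engine Toolkit."""
    arch = platform.machine().lower()
    # Normalize common aliases to the LLVM arch names used in the table.
    arch = {"amd64": "x86_64", "arm64": "aarch64"}.get(arch, arch)
    os_part = {
        "Windows": "pc-windows-gnu",
        "Linux": "unknown-linux-gnu",
        "Darwin": "apple-darwin",
    }.get(platform.system(), "unknown")
    return f"{arch}-{os_part}"

print(local_target_hint())  # e.g. "x86_64-unknown-linux-gnu" on an Intel Linux box
```

If the printed triple does not appear with a check mark in the table for your chosen wrapper, that wrapper's bundled dynamic library will likely fail to load on your machine.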
## Radix Engine Toolkit URL: https://radix.wiki/developers/legacy-docs/integrate/radix-engine-toolkit/radix-engine-toolkit Updated: 2026-02-18 Summary: Rust Radix Engine Toolkit Examples Integrate > Radix Engine Toolkit — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/radix-engine-toolkit/README.md) Rust Radix Engine Toolkit Examples Example usage of the various Radix Engine Toolkit wrappers is provided in the experimental-examples repository (https://github.com/radixdlt/experimental-examples/tree/main/radix-engine-toolkit) . The Radix Engine Toolkit is a multi-language library that provides various Scrypto and Radix Engine types, functions, primitives, and concepts essential for building off-ledger applications in a particular language. If the low-level support of the Radix Engine Toolkit is not available for a particular language, development of off-ledger applications in that language is still possible, albeit difficult, as different parts of the stack would need to be implemented by the author. Radix Engine Toolkit wrappers are available in the following programming languages: - TypeScript[1] - Python - Swift - C# - Kotlin Rust Support The above list does not mention the Rust Radix Engine Toolkit for two reasons: - The above is a list of the Radix Engine Toolkit wrappers. The Rust implementation is not a wrapper around any lower-level library. - The objective of the Radix Engine Toolkit is to provide Scrypto and Radix Engine types, functions, primitives, and concepts to languages that do not have them; Rust already has an implementation in the form of the Scrypto and Radix Engine crates. Functionality The functionality provided by the Radix Engine Toolkit is limited to just off-ledger functionality. This means that the Radix Engine Toolkit cannot be used for things such as querying the status of transactions or querying an account balance, as these require knowledge of ledger state which the Radix Engine Toolkit does not have.
However, the Radix Engine Toolkit can be used for things that do not require ledger state, such as the construction of manifests and transactions and the derivation of addresses. The following is a list of the functionalities provided by the Radix Engine Toolkit:

- Transaction
  - Manifest Building
  - Construction
  - Compilation
  - Decompilation
  - Identifier Hashing
  - Static Validation
  - Address Extraction
  - Execution Analysis
  - Transaction Types
- Derivation
  - Derivation of virtual account and identity addresses.
  - Olympia to Babylon account and resource address mapping.
  - Olympia address derivation.
  - Virtual signature non-fungible global IDs.
- Scrypto and Manifest SBOR
  - Decoding of SBOR payloads to string representations.
  - (limited) Encoding of SBOR string representations.
- Events
  - Decoding of events emitted by native components into strongly typed models.
- Utils
  - Hashing

Example Use Cases The Radix Engine Toolkit and the Core/Gateway APIs are the two tools most non-Scrypto applications use, and these two tools have different responsibilities and capabilities. The Radix Engine Toolkit is used for things that do not require ledger state while the Core and Gateway APIs are used when such ledger state is required. The following is a non-comprehensive table of examples of where the Radix Engine Toolkit and Core/Gateway APIs are used.
It is meant to give the reader a rough idea of where the responsibilities of the Radix Engine Toolkit end and those of the Core/Gateway APIs begin.

| Tool | Example Use Cases |
| --- | --- |
| Radix Engine Toolkit | Construction of manifests; construction of transactions; signing of transactions |
| Core/Gateway API | Submission of transactions; checking of transaction status; streaming transactions; getting the current epoch for use in transaction construction |

The above comparison aside, the following list provides the reader with an idea of areas where the Radix Engine Toolkit can be used, to help determine if it is the right fit: - When developing applications that need to construct and sign transactions programmatically without going through the wallet signing process. This may be needed by wallet developers, exchanges, custodians, bot developers, and other areas that rely on programmatic transactions. In a case like this, the Radix Engine Toolkit is often used in conjunction with either the Core or Gateway APIs. - When developing frontend or backend services that need to understand SBOR, decode it, and present it in a human-readable way. One example of this may be an NFT marketplace that wishes to display the SBOR-encoded non-fungible data to users as human-readable data. - When developing applications that need to be able to decompile transactions and show the manifest as well as the header and signature to users. One example of that would be dashboards and explorers which need to decompile transactions and display the manifest to the user. Other use cases exist as well and they all revolve around the features in the functionalities (README.md#functionality) section. [1] The TypeScript Radix Engine Toolkit is slightly different from the other toolkit wrappers and has a slightly different interface. More information about it can be found here (https://github.com/radixdlt/typescript-radix-engine-toolkit) .
## Gateway API Providers URL: https://radix.wiki/developers/legacy-docs/integrate/network-apis/gateway-api-providers Updated: 2026-02-18 Summary: The following is a list of known Gateway API(README.md) providers. Integrate > Network APIs > Gateway API Providers — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/network-apis/gateway-api-providers.md) The following is a list of known Gateway API (https://github.com/gguuttss/radix-docs/blob/master/integrate/network-apis/README.md) providers. This list is not vetted Inclusion in this list does not imply endorsement or guarantees of API data accuracy or stability. Please look at the individual integrator for more details, to see if it fits your needs. If you provide a Gateway API service and would like to be included here, please get in touch on Discord or at hello@radixdlt.com (mailto:hello@radixdlt.com) . Radix Foundation The Radix Foundation runs an IP rate-limited Gateway API, the Foundation Gateway. The Foundation Gateway Service is IP rate limited, with rate limits designed only for a user browsing a front-end dApp (with a relatively light request burden, think no more than ~40 requests per page load) alongside using the connect button and their wallet. Front-end dApps will need to use the Gateway responsibly to maintain a good user experience. The exact rate limits imposed by the Foundation Gateway depend on the endpoint you are hitting and are subject to occasional tweaking. If you are using the Foundation Gateway and are concerned about hitting rate limits, you likely want to find another provider. In particular, if you are building a back-end with large or user-driven request profiles, you will need to consider alternatives. You may get away with using the Core API and running your own Node, or using a Node RPC provider. Or, if you need the Gateway API, find a Gateway API provider, or consider running one yourself.
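Because the Foundation Gateway is IP rate limited, clients should be prepared for occasional HTTP 429 responses. A minimal stdlib Python sketch of capped exponential backoff (a generic retry pattern, not a documented Gateway requirement; the exact limits and response headers should be checked against the Gateway itself):

```python
import time
import urllib.error
import urllib.request

def backoff_delays(attempts: int, base: float = 0.5, cap: float = 8.0) -> list:
    """Capped exponential backoff schedule: 0.5s, 1s, 2s, ... up to `cap`."""
    return [min(base * (2 ** i), cap) for i in range(attempts)]

def get_with_retry(url: str, attempts: int = 5) -> bytes:
    """Fetch `url`, sleeping and retrying when the server answers 429."""
    for delay in backoff_delays(attempts):
        try:
            with urllib.request.urlopen(url) as resp:
                return resp.read()
        except urllib.error.HTTPError as err:
            if err.code != 429:
                raise  # only retry on rate limiting
            time.sleep(delay)
    raise RuntimeError("still rate limited after retries")
```

For production back-ends, prefer a dedicated provider or your own Gateway over aggressive retrying against the rate-limited Foundation endpoints.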
The rate-limited Foundation Gateway is available at: - Mainnet: https://mainnet.radixdlt.com (https://mainnet.radixdlt.com) ( Swagger (https://mainnet.radixdlt.com/swagger/) ) - Stokenet: https://stokenet.radixdlt.com (https://stokenet.radixdlt.com) ( Swagger (https://stokenet.radixdlt.com/swagger/) ) RadixAPI Key Links: Website (https://radixapi.net/) | Telegram Channel (https://t.me/radixapi) RadixAPI (https://radixapi.net/) offers Gateway access as well as additional endpoints and functionality covering many integrator and dApp use cases. For more detail, see: - Their Pricing (https://radixapi.net/pricing) and Service Docs (https://docs.radixapi.net/) - Their Swagger API docs (https://api.radixapi.net/docs) - note that this doesn't properly list the Gateway endpoints. All Gateway endpoints are supported under /v1/gateway/mainnet/... NowNodes NowNodes (https://nownodes.io/nodes) has added support for the Radix Gateway API (at xrd-gateway.nownodes.io). Their pricing plans are available here (https://nownodes.io/pricing) and include the option of a month-long free trial. Find a partner Community dApps have partnered together to reduce the overhead of running their own Gateway. It's worth asking in the community for more details. Run your own Running a gateway (run-infrastructure) is relatively complex. It requires more resources than a node, and more infrastructure. It also includes running a node, and will require keeping the infrastructure updated regularly along with the network's protocol updates. ## Core API Providers URL: https://radix.wiki/developers/legacy-docs/integrate/network-apis/core-api-providers Updated: 2026-02-18 Summary: The following is a list of known Core API(README.md) providers.
Integrate > Network APIs > Core API Providers — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/network-apis/core-api-providers.md) The following is a list of known Core API (https://github.com/gguuttss/radix-docs/blob/master/integrate/network-apis/README.md) providers. This list is not vetted Inclusion in this list does not imply endorsement or guarantees of API data accuracy or stability. Please look at the individual integrator for more details, to see if it fits your needs. If you provide a Core API service and would like to be included here, please get in touch on Discord or at hello@radixdlt.com (mailto:hello@radixdlt.com) . Radix Foundation The foundation does not provide a public Core API. Grove Grove (https://www.grove.city/) — the founder of the POKT network (http://www.pokt.network/) — has integrated Radix (https://www.radixdlt.com/blog/radix-partners-with-grove-to-power-devex-in-defi) to provide API services to community developers. Community developers can use the Core API via Grove for free, up to a total of 5M requests per day. Grove is providing the following two options to get started: - Immediate access to a Core API can be obtained for free using the following Grove-powered public endpoint as a base URL: https://radix-mainnet.rpc.grove.city/v1/326002fc/core - For those interested in more granular control over their endpoints, including allowlisting capabilities, logging, and usage insights, a free private Radix endpoint can be minted at https://portal.grove.city (https://portal.grove.city) . NowNodes NowNodes (https://nownodes.io/nodes) has added support for the Radix Core API (at xrd.nownodes.io). Their pricing plans are available here (https://nownodes.io/pricing) and include the option of a month-long free trial.
Run your own Running a node (run-infrastructure) is relatively straightforward, and gives you control of your infrastructure, although it will require keeping the node updated regularly along with the network's protocol updates. ## Network APIs URL: https://radix.wiki/developers/legacy-docs/integrate/network-apis/network-apis Updated: 2026-02-18 Summary: Radix currently offers a few main APIs, targeted to different use cases. Integrate > Network APIs — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/network-apis/README.md) Main APIs Radix currently offers a few main APIs, targeted to different use cases. Core API Key Links: Full Core API Documentation (https://radix-babylon-core-api.redoc.ly/) | Core API Providers (https://github.com/gguuttss/radix-docs/blob/master/integrate/network-apis/core-api-providers.md) | Typescript Core API SDK (https://www.npmjs.com/package/@radixdlt/babylon-core-api-sdk) The Core API provides low-level and high-level abstractions, and includes: - The long-term support "LTS" sub-api, designed for financial integrators. - A usefully-abstracted but not comprehensive view of the current ledger state. This includes account balances, and other key native components. - Transaction preview, submission + status flow - A committed transaction stream, at varying abstraction levels The Core API is exposed by Radix nodes (run-infrastructure) . It is predominantly intended as a private API, but there are some RPC providers offering it (https://github.com/gguuttss/radix-docs/blob/master/integrate/network-apis/core-api-providers.md) for public integrations.
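As an illustration of talking to a node's Core API, the sketch below builds a POST request to the LTS construction endpoint (the /lts/transaction/construction path is referenced in the worked examples later in this export). The base URL, port, and request-body shape here are assumptions; check the full Core API documentation for the exact schema:

```python
import json
import urllib.request

# Assumed local node address; the Core API port and base path depend on
# your own node configuration.
CORE_API_BASE = "http://localhost:3333/core"

def lts_construction_request(network: str) -> urllib.request.Request:
    """Build a POST to the LTS construction endpoint. The {"network": ...}
    body is an assumption for illustration; verify the schema against the
    Core API documentation before use."""
    body = json.dumps({"network": network}).encode()
    return urllib.request.Request(
        CORE_API_BASE + "/lts/transaction/construction",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

The request object can then be sent with urllib.request.urlopen against your own node; since the Core API is predominantly a private API, the target should normally be infrastructure you control.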
Gateway API Key Links: Full Gateway API Documentation (https://radix-babylon-gateway-api.redoc.ly/) | Gateway API Providers (https://github.com/gguuttss/radix-docs/blob/master/integrate/network-apis/gateway-api-providers.md) | Typescript Gateway API SDK (https://www.npmjs.com/package/@radixdlt/babylon-gateway-api-sdk) The Gateway API provides low-level and high-level abstractions, but is primarily intended for use by dApp website frontends and general network clients like dashboards. It can be used for: - Reading the state of accounts or other components - Transaction preview, submission + status flow; including management of resubmissions on behalf of users - A filterable committed transaction history - Queries of historic ledger state The Gateway API is exposed by the Network Gateway (run-infrastructure) , and provided by Gateway API Providers (https://github.com/gguuttss/radix-docs/blob/master/integrate/network-apis/gateway-api-providers.md) . Engine State API Key Links: Full Engine State API Documentation (https://radix-babylon-engine-state-api.redoc.ly/) The Engine State API allows for accessing the complete current state of the Engine ledger, at the abstraction of the engine. - Compared to the Core API, it is fully comprehensive but possibly harder to use, with fewer, more general endpoints. - It can be useful for some dApp developers looking to read current application state without running a Gateway, and at a cleaner abstraction level. - It is useful for integrators to explore how the engine thinks about state. The Engine State API is exposed by Radix nodes (run-infrastructure) . The node has to be explicitly configured to enable this API. Details on how to configure the node can be found in the v1.1.3.1 release notes (https://github.com/radixdlt/babylon-node/releases/tag/v1.1.3.1) .
Community APIs If the Core and Gateway APIs don’t currently meet your needs, members of the community have built their own APIs, which may have just the endpoint you’re looking for. These are unvetted Inclusion in this list does not imply endorsement or make any claims of correctness or stability. Please look at the individual service for more details, and judge for yourself if it fits your needs. If you are responsible for providing an API service and would like it to be included here, please get in touch on Discord or at hello@radixdlt.com (mailto:hello@radixdlt.com) . RadixAPI Key Links: Website (https://radixapi.net/) | Telegram Channel (https://t.me/radixapi) RadixAPI (https://radixapi.net/) provides additional endpoints and functionality (https://docs.radixapi.net/) to the official Gateway API to make it easier to obtain specific data. These include: - Creating and verifying ROLA (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/rola-radix-off-ledger-auth.md) challenges - Finding the owners of a fungible or non-fungible resource - Finding the owners of validator stake and pool unit tokens - Receiving all transaction data via WebSocket - …and more Other APIs System API Key Links: Full System API Documentation (https://radix-babylon-system-api.redoc.ly/) Radix Nodes also expose the System API, which can be used for debugging the node. Its documentation is available on ReDocly here (https://radix-babylon-system-api.redoc.ly/) . ## Integrate with Radix FAQs URL: https://radix.wiki/developers/legacy-docs/integrate/integrate-with-radix/integrate-with-radix-faqs Updated: 2026-02-18 Summary: If you wish to see accounts in the official Radix Wallet, you will need to derive the account addresses using the same derivation scheme from a seed phrase.
This uses SLIP-10. Integrate > Integrate with Radix > Integrate with Radix FAQs — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/integrate-with-radix/integrate-with-radix-faqs.md) Using HD derivation compatible with the official Radix Wallet If you wish to see accounts in the official Radix Wallet, you will need to derive the account addresses using the same derivation scheme from a seed phrase. This uses SLIP-10, as per the RustDoc here (https://github.com/radixdlt/wallet-compatible-derivation/blob/e27de534a305561b1d7a9bbf8bc6c03c066d5425/crates/wallet_compatible_derivation/src/account_path.rs#L3) . A full reference implementation for educational purposes is available here: https://github.com/radixdlt/wallet-compatible-derivation (https://github.com/radixdlt/wallet-compatible-derivation) Transitioning from the Olympia JavaScript SDK If you’re currently using Radix Olympia JavaScript libraries for your HD derivation such as @radixdlt/account (https://www.npmjs.com/package/@radixdlt/account) , you can convert the in-memory private keys generated this way using the following:

// Assuming you have these in scope:
const hdMasterSeed = //...
const derivationPath = //...

// Then the following generates a Babylon Private Key for in-memory signing:
const privateKey = PrivateKey.EcdsaSecp256k1(
  hdMasterSeed.masterNode().derive(derivationPath).privateKey.toString()
);

Compiling the node from source If you wish to compile the node from source, there is some documentation here, under the RCnet-V1.0 release notes (https://github.com/radixdlt/babylon-node/releases/tag/rcnet-v1-c6360105d) . ## Olympia to Babylon migration, address mapping and reconciliation URL: https://radix.wiki/developers/legacy-docs/integrate/integrate-with-radix/updating-from-olympia/olympia-to-babylon-migration-address-mapping-and-reconciliation Updated: 2026-02-18 Summary: There was an automatic migration of state from Olympia to Babylon on release day in September 2023.
At switchover time, the Olympia network stopped and the Babylon nodes created the Babylon genesis from its end state. Integrate > Integrate with Radix > Updating from Olympia > Olympia to Babylon migration, address mapping and reconciliation — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/integrate-with-radix/updating-from-olympia/olympia-to-babylon-migration-address-mapping-and-reconciliation.md) There was an automatic migration of state from Olympia to Babylon on release day in September 2023. At switchover time, the Olympia network stopped, the Babylon nodes read the end state from their Olympia node, and used it to create the Babylon genesis. As more validators completed the process, the Babylon network then sprang into life. To read more on the mechanics of the switchover, see this blog here: Babylon Automatic Migration Guide (https://www.radixdlt.com/blog/babylon-automatic-migration-guide) . As part of the migration (during Babylon mainnet epoch 32717), each entity was recreated with its new address, resources got minted, owner badges were created and passed to their accounts, and stakes got turned into stake units. If you need to reconcile the Olympia and Mainnet ledgers, please see this spreadsheet: Radix Mainnet - Olympia-Babylon Address Mapping [PUBLIC DOCUMENT]. The spreadsheet captures instructions about the migration, as well as multiple tabs covering the data which was migrated, including mappings between the Olympia and Babylon addresses for all addresses that existed at the end of Olympia.
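If you export the address-mapping tabs as CSV for reconciliation, a small stdlib loader can index Olympia addresses to their Babylon counterparts. The column names and sample addresses below are hypothetical placeholders; check the spreadsheet's actual headers before relying on them:

```python
import csv
import io

# Hypothetical two-column extract of the address-mapping spreadsheet; the
# real document has more tabs and columns, and these addresses are
# placeholders rather than real ledger addresses.
SAMPLE = """olympia_address,babylon_address
rdx1qsp...abc,account_rdx1...xyz
"""

def load_mapping(csv_text: str) -> dict:
    """Index Olympia addresses to their Babylon counterparts."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return {row["olympia_address"]: row["babylon_address"] for row in reader}

mapping = load_mapping(SAMPLE)
```

A lookup table like this makes it straightforward to join Olympia-era transaction records against Babylon balances during reconciliation.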
## Sourcing data from Olympia URL: https://radix.wiki/developers/legacy-docs/integrate/integrate-with-radix/updating-from-olympia/sourcing-data-from-olympia Updated: 2026-02-18 Summary: Depending on your product, it may be easiest to support current data from Babylon, but require users to upload their Olympia account history CSV, instead of integrating the Olympia APIs. Integrate > Integrate with Radix > Updating from Olympia > Sourcing data from Olympia — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/integrate-with-radix/updating-from-olympia/sourcing-data-from-olympia.md) - The Olympia Explorer is available at https://explorer.radixdlt.com/ (https://explorer.radixdlt.com/) . Users can download their Olympia account history in CSV from the explorer. This will run until at least September 2026. - The Olympia Gateway API is available at https://olympia-gateway.radixdlt.com/ (https://olympia-gateway.radixdlt.com/) with API docs at https://radix-olympia-gateway.redoc.ly/ (https://radix-olympia-gateway.redoc.ly/) . This will run until at least September 2026. Depending on your product, it may be easiest to support current data from Babylon, but require users to upload their Olympia account history CSV, instead of integrating the Olympia APIs. For further information, please see https://www.radixdlt.com/blog/babylon-automatic-migration-guide (https://www.radixdlt.com/blog/babylon-automatic-migration-guide) . ## Changes in Babylon from Olympia URL: https://radix.wiki/developers/legacy-docs/integrate/integrate-with-radix/updating-from-olympia/changes-in-babylon-from-olympia Updated: 2026-02-18 Summary: From an integration perspective, Babylon is very different from Olympia.
The thing that will stay is the token balances: Babylon will start automatically when Olympia ends, with Olympia’s end state forming Babylon’s genesis. Integrate > Integrate with Radix > Updating from Olympia > Changes in Babylon from Olympia — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/integrate-with-radix/updating-from-olympia/changes-in-babylon-from-olympia.md) From an integration perspective, Babylon is very different from Olympia. The thing that will stay is the token balances: Babylon will start automatically when Olympia ends, with Olympia’s end state forming Babylon’s genesis. Token balances will be the same, controlled by the same keys, and validator stakes will stay. But integrating with Babylon requires a separate integration. Compared with Olympia, in Babylon: - The state model is different: - State is stored under a key-value style account-model, in a tree-like structure - not a UTXO-model like in Olympia. - The addresses are different: - They are encoded with Bech32m instead of Bech32 and use different human readable prefixes (HRPs). A Babylon account address will start with account_rdx1_________. The XRD address will also change. - Babylon supports both Secp256k1 and Ed25519 curves for key-pairs. We recommend Ed25519 to new integrators, as it has slightly cheaper fees, but you can safely continue to use Secp256k1 keys if you wish - as your Olympia keys will be Secp256k1. - The Radix Engine Toolkit will enable you to switch between Olympia Secp256k1 Account Addresses and Babylon Secp256k1 “virtual” Account Addresses, and derive a Babylon “virtual” Account Address from a Secp256k1 or Ed25519 public key. - The transaction model is different: - The transaction is intent-based, supports offline construction, and does not contain UTXO identifiers - this makes it easy to construct and submit parallel transactions touching the same account (this was not possible in Olympia, where transactions would need to be serialized to prevent UTXO collisions).
- The transaction header has a built-in epoch-based expiry mechanism, and will eventually support explicit cancellation. - Transaction replay prevention makes use of the transaction’s intent hash to guarantee the same intent cannot be committed twice. - Transaction construction uses the Radix Engine Toolkit, not the node, and can be done entirely offline. - The API surface is different: - For exchange integrations involving fungible tokens and accounts, we believe the Core API on the Node should be sufficient - this means you should not need to run a Gateway. - We have designed a long-term support (LTS) subset of the Core API which is suitable for exchange integrations. It includes transaction submission, status and result reporting, account balance reads, and account fungible balance history. - The transaction construction flow is different: - You will need to use the Radix Engine Toolkit for transaction construction. - The Radix Engine Toolkit will allow you to create your transaction, determine the necessary hash for you to sign, and create a notarized payload for submission to the network. - The “state version” in the transaction stream will start again from 1: - You will likely want to store Babylon transaction records separately to Olympia transactions. - The epoch numbering however will not restart - the first Babylon epoch will be 1 more than the last Olympia epoch. ## Updating from Olympia URL: https://radix.wiki/developers/legacy-docs/integrate/integrate-with-radix/updating-from-olympia/updating-from-olympia Updated: 2026-02-18 Summary: Skip this section if you do not need an integration with historic data from Radix’s Olympia network. Integrate > Integrate with Radix > Updating from Olympia — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/integrate-with-radix/updating-from-olympia/README.md) Info Skip this section if you do not need an integration with historic data from Radix’s Olympia network. 
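The address-format change described in “Changes in Babylon from Olympia” lends itself to a quick sanity check when sorting mixed Olympia/Babylon records. A minimal TypeScript sketch - the account_rdx1 prefix is stated above, while treating a bare rdx1 prefix as an Olympia mainnet account address is an assumption you should verify against your own data:

```typescript
// Sketch: distinguish Olympia-style and Babylon-style mainnet account
// addresses by their human-readable prefix (HRP).
// "account_rdx1" for Babylon accounts comes from the docs above; the bare
// "rdx1" Olympia prefix is an assumption - verify it before relying on it.
type AddressKind = "babylon-account" | "olympia-account" | "unknown";

function classifyAccountAddress(address: string): AddressKind {
  if (address.startsWith("account_rdx1")) return "babylon-account";
  if (address.startsWith("rdx1")) return "olympia-account";
  return "unknown";
}
```

A real integration should use the Radix Engine Toolkit for address handling and conversion; string prefix checks are only a coarse filter.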
## Worked example 4: Node status and monitoring URL: https://radix.wiki/developers/legacy-docs/integrate/integrate-with-radix/exchange-integration-guide/exchange-integration-worked-example-4-node-status-and-monitoring Updated: 2026-02-18 Summary: You can use the following endpoints to monitor status: Integrate > Integrate with Radix > Exchange Integration Guide > Worked example 4: Node status and monitoring — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/integrate-with-radix/exchange-integration-guide/exchange-integration-worked-example-4-node-status-and-monitoring.md) You can use the following endpoints to monitor status: - Endpoints on the System API or Prometheus Metrics API (https://github.com/gguuttss/radix-docs/blob/master/run/node/node-maintenance-and-administration/node-setting-up-grafana.md) - Sync status can also be investigated via: - The ledger_clock from /lts/transaction/construction can tell you how far the node’s ledger tip (ie, the ledger_clock) is behind the current time. This can be useful for tracking sync-up. 
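The ledger_clock lag check above can be sketched in TypeScript. The endpoint path follows the LTS Core API described in this guide; the response shape (a plain ISO-8601 ledger_clock field) and the fetch-like client parameter are assumptions - check the Core API documentation for the exact layout:

```typescript
// Sketch: estimate how far a node's ledger tip lags behind wall-clock time,
// using the ledger_clock returned by /core/lts/transaction/construction.

// Pure helper: seconds between "now" and the ledger tip timestamp, clamped at 0.
function ledgerLagSeconds(ledgerClockIso: string, now: Date = new Date()): number {
  const tipMs = new Date(ledgerClockIso).getTime();
  return Math.max(0, (now.getTime() - tipMs) / 1000);
}

// Minimal fetch-like signature so the sketch has no global dependencies.
type FetchLike = (url: string, init: object) => Promise<{ json(): Promise<any> }>;

// coreApiBase is e.g. "http://localhost:3333/core" (conventional port 3333).
async function nodeSyncLag(coreApiBase: string, network: string, fetchFn: FetchLike): Promise<number> {
  const res = await fetchFn(`${coreApiBase}/lts/transaction/construction`, {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify({ network }),
  });
  const body = await res.json();
  // Field shape assumed: an ISO-8601 timestamp of the node's ledger tip.
  return ledgerLagSeconds(body.ledger_clock);
}
```

A lag of more than a few seconds on a live network suggests the node is still syncing up.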
Assuming the network is currently live, this will give you how far the node is behind the network. ## Worked example 3: Tracking deposits to specific account URL: https://radix.wiki/developers/legacy-docs/integrate/integrate-with-radix/exchange-integration-guide/exchange-integration-worked-example-3-tracking-deposits-to-specific-account Updated: 2026-02-18 Summary: As mentioned in the Transaction Results section, there is no such thing as a user “transaction type” such as a “transfer” - all transactions make use of a trans Integrate > Integrate with Radix > Exchange Integration Guide > Worked example 3: Tracking deposits to specific account — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/integrate-with-radix/exchange-integration-guide/exchange-integration-worked-example-3-tracking-deposits-to-specific-account.md) Reminder As mentioned in the Transaction Results section, there is no such thing as a user “transaction type” such as a “transfer” - all transactions make use of a transaction manifest, and could do anything, such as call DeFi components. Instead - we encourage you to think about the transaction’s resultant balance changes. For example, a transaction where account X gains 200 XRD could be interpreted as a transfer to account X. By using the /core/lts/stream/account-transaction-outcomes you can filter the transaction stream to only transactions which somehow referenced the given account address. For each account you are tracking, store a last_state_version and poll this endpoint from that state version - updating the last_state_version if a new transaction is returned. You can then read out any relevant fungible balance changes from that transaction, possibly filtering to balance changes under the given account. Note A transaction may reference an account, but might not have any fungible balance changes in the transaction, e.g. if a non-fungible resource was deposited to the account. 
This may result in the transaction appearing in this feed under an account, without it having any fungible balance changes against that account. ## Worked example 2: Tracking deposits to any account URL: https://radix.wiki/developers/legacy-docs/integrate/integrate-with-radix/exchange-integration-guide/exchange-integration-worked-example-2-tracking-deposits-to-any-account Updated: 2026-02-18 Summary: By using the /core/lts/stream/transaction-outcomes you can ingest the transaction stream from the Babylon ledger. Integrate > Integrate with Radix > Exchange Integration Guide > Worked example 2: Tracking deposits to any account — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/integrate-with-radix/exchange-integration-guide/exchange-integration-worked-example-2-tracking-deposits-to-any-account.md) By using the /core/lts/stream/transaction-outcomes you can ingest the transaction stream from the Babylon ledger. Store a from_state_version and poll this endpoint from that state version - updating the from_state_version to be the next transaction to read as you go. The returned LtsCommittedTransactionOutcome for each transaction includes its status (Committed Success or Committed Failure) as well as the resultant balance changes (due to e.g. fee payments or vault withdrawals/deposits) of the transaction - grouped by the global entity (e.g. accounts or components) which owned the vault/s which were updated (directly or indirectly). Negative balance changes can be interpreted as withdrawals from the global entity, and positive balance changes can be interpreted as deposits. Note It is possible that a transaction can have multiple withdrawals from different global entities, and multiple deposits to different global entities. A transaction which has account A paying the XRD fee and having a single withdrawal of resource R, and another account B having a deposit of resource R, could be interpreted as a simple transfer of resource R. 
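The balance-change interpretation above (negative = withdrawal, positive = deposit) can be sketched as a small TypeScript helper. The field names are assumptions about the LtsCommittedTransactionOutcome shape, not the exact API schema:

```typescript
// Sketch: split one transaction's fungible balance changes (as returned by
// /core/lts/stream/transaction-outcomes) into deposits and withdrawals per
// global entity. Field names below are assumed - check the Core API spec.

interface FungibleBalanceChange {
  entityAddress: string;   // account or component owning the updated vault(s)
  resourceAddress: string;
  balanceChange: string;   // signed decimal string, e.g. "-1.5" or "200"
}

function splitBalanceChanges(changes: FungibleBalanceChange[]): {
  deposits: FungibleBalanceChange[];
  withdrawals: FungibleBalanceChange[];
} {
  const deposits: FungibleBalanceChange[] = [];
  const withdrawals: FungibleBalanceChange[] = [];
  for (const change of changes) {
    // Negative changes are withdrawals from the entity; positive are deposits.
    if (change.balanceChange.trim().startsWith("-")) withdrawals.push(change);
    else deposits.push(change);
  }
  return { deposits, withdrawals };
}
```

In a real poller you would feed each page of outcomes through a helper like this, then advance your stored from_state_version.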
But more complicated transactions are possible, and should be handled. ## Worked example 1: Transfer transaction submission flow URL: https://radix.wiki/developers/legacy-docs/integrate/integrate-with-radix/exchange-integration-guide/exchange-integration-worked-example-1-transfer-transaction-submission-flow Updated: 2026-02-18 Summary: Then, use the Toolkit for exchanges to build the transaction: Integrate > Integrate with Radix > Exchange Integration Guide > Worked example 1: Transfer transaction submission flow — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/integrate-with-radix/exchange-integration-guide/exchange-integration-worked-example-1-transfer-transaction-submission-flow.md) Note See here for example code (https://github.com/radixdlt/typescript-radix-engine-toolkit/blob/main/examples/core-e2e-example/main.ts) for this worked example. As preparation: - Know which account you wish to be transferring from/to, the resource, and quantity. - Use the /core/lts/transaction/construction Core API endpoint to find the current epoch. You should check that the ledger_clock field is close to the current time to ensure that the node is synced up, and that the current epoch is accurate. - Check the account’s current XRD balance with /core/lts/state/account-fungible-resource-balance to ensure you have sufficient XRD balance to pay fees from the account, and any resource you wish to transfer. - Use the /core/lts/state/account-deposit-behaviour Core API to check whether the target account currently accepts deposits (../../../reference/radix-engine/native-blueprints/account.md#configuring-account-deposit-modes-and-resource-preference-map) of the resource you are transferring. - This endpoint returns all the details explaining why the given transfer would be accepted (or not). 
Please expand the response schema of its API documentation (https://radix-babylon-core-api.redoc.ly/#tag/LTS/paths/~1lts~1state~1account-deposit-behaviour/post) to see the summary of all relevant transfer rules. - If the API tells you that the account does not currently accept the deposit, you should inform the user to update their account settings. Then, use the Toolkit for exchanges to build the transaction: - Use the SimpleTransactionBuilder to build a transaction with the fee-payer account address / notary corresponding to the public key of the sender account. - Note: The SimpleTransactionBuilder uses the “notary is signer” flag and 0 signatories. So notarizing is equivalent to signing the transaction. - The builder will return a hash_to_sign which will be signed by the corresponding private key. - The toolkit can then be used to compile the notarized transaction, and returns you the: - Notarized transaction payload (for submission) - Transaction Identifier (also known as the intent hash). You can then submit the notarized transaction payload to /core/lts/transaction/submit in the Core API, and then poll the /core/lts/transaction/status endpoint with the transaction intent hash to look at the current status of the transaction. Important Please read the Transaction Outcome section of this document. As well as the success case “Committed Success”, there are a number of different failure cases which are possible and will need to be handled: “Temporary Rejection”, “Permanent Rejection”, and “Committed Failure”. Typically you will want to handle these in the following ways: - Temporary Rejection: You may wish to wait or resubmit the same transaction (with the same transaction intent / transaction identifier). - Be careful: the transaction may still be committed successfully! For example - if not enough XRD is available to lock a fee in the account, the transaction will be marked as a temporary rejection. 
If the account is later topped up, the transaction might still go through. - Eventually the transaction will be permanently rejected, once the “max epoch” configured during transaction construction has passed. - You may wish to tune the max epoch so that transactions permanently reject sooner. Each epoch lasts about 5 minutes. - Permanent Rejection or Committed Failure: The transaction at this stage cannot be committed successfully. You will need to remedy the reason for failure - e.g. the account doesn’t have enough XRD to pay for fees - and then build/sign a new, replacement transaction - which will have a new transaction identifier. If committed (as either a success or failure), the status endpoint will return a (resultant) state_version for the transaction. If you wish, that can be used with the /core/lts/stream/transaction-outcomes endpoint (as the from_state_version with limit 1) to return the balance changes of the transaction. ## Detecting deposits from different users URL: https://radix.wiki/developers/legacy-docs/integrate/integrate-with-radix/exchange-integration-guide/exchange-integration-detecting-deposits-from-different-users Updated: 2026-02-18 Summary: There are two main approaches: Integrate > Integrate with Radix > Exchange Integration Guide > Detecting deposits from different users — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/integrate-with-radix/exchange-integration-guide/exchange-integration-detecting-deposits-from-different-users.md) There are two main approaches: - Approach 1: Generate new account addresses for each user (better UX) - Generate a new public key pair for the given user, and derive a virtual account address for it from the Toolkit. Store this account-to-user correspondence, and set up tracking for this account. - OPTIONAL: Consider sending a transaction to update the account’s deposit rules so that only the supported resources can be deposited there (e.g. XRD). 
This would prevent users accidentally sending the wrong resource to the account. See docs (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/account.md) . - On your “Deposit” page: - Show this account address - Show the QR code with content of radix:<account-address>, where <account-address> should be replaced by the account address that has been generated for the user. - Direct the user to initiate a transfer to that address from their wallet. - Monitor deposits into these generated account addresses (see Worked Example 2), and use the stored address-to-user mapping to resolve the user who deposited. - Approach 2: Use a single address with messages to distinguish the user Warning Encrypted messages will not be implemented in the wallet for mainnet launch. - Have a single account address which receives transfers. - OPTIONAL: Consider sending a transaction to update the account’s deposit rules so that only the supported resources can be deposited there (e.g. XRD). This would prevent users accidentally sending the wrong resource to the account. See docs (https://github.com/gguuttss/radix-docs/blob/master/reference/radix-engine/native-blueprints/account.md) . - On your “Deposit” page: - Show a required message and this account address, and make clear to the user that they must input this message with their transfer. - Show the QR code with content of radix:<account-address>, where <account-address> should be replaced by the account address. - Direct the user to initiate a transfer to that address from their wallet, and include the message. - Monitor deposits into the account address (see Worked Example 3), and use the attached message to resolve the user who deposited. 
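The bookkeeping for Approach 1 can be sketched as follows. Deriving the per-user account address itself is done with the Radix Engine Toolkit (see the LTS Toolkit section), so it is left out here:

```typescript
// Sketch of Approach 1's bookkeeping: remember which generated account
// address belongs to which user, and resolve observed deposits back to users.
// In production this map would live in your database, keyed by address.

class DepositAddressRegistry {
  private readonly userByAddress = new Map<string, string>();

  // Record a freshly generated deposit address for a user.
  register(accountAddress: string, userId: string): void {
    this.userByAddress.set(accountAddress, userId);
  }

  // Resolve a deposit seen in the transaction stream; returns undefined for
  // addresses we did not generate (e.g. unrelated accounts in the stream).
  resolve(accountAddress: string): string | undefined {
    return this.userByAddress.get(accountAddress);
  }
}
```

When monitoring the stream (Worked Example 2), any deposit whose target address resolves to a user can be credited to that user; unresolved addresses are simply not yours.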
## Integration: LTS Toolkit URL: https://radix.wiki/developers/legacy-docs/integrate/integrate-with-radix/exchange-integration-guide/exchange-integration-lts-toolkit Updated: 2026-02-18 Summary: The Radix Engine Toolkit (RET)(../../radix-engine-toolkit/README.md) is available for a number of different languages: TypeScript, Python, Swift, C#, Kotlin (JV Integrate > Integrate with Radix > Exchange Integration Guide > Integration: LTS Toolkit — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/integrate-with-radix/exchange-integration-guide/exchange-integration-lts-toolkit.md) The Radix Engine Toolkit (RET) (https://github.com/gguuttss/radix-docs/blob/master/integrate/radix-engine-toolkit/README.md) is available for a number of different languages: TypeScript, Python, Swift, C#, Kotlin (JVM) and Rust. It is a library for building Radix Transactions. The Typescript toolkit is particularly useful, as it includes an “LTS” interface. This offers a simplified API designed for exchanges and is available in Typescript for running in NodeJS (https://github.com/radixdlt/typescript-radix-engine-toolkit/blob/main/LTS.md) . Examples in the rest of this document will be written assuming the TypeScript LTS RET is being used. ## Integration: LTS Core API URL: https://radix.wiki/developers/legacy-docs/integrate/integrate-with-radix/exchange-integration-guide/exchange-integration-lts-core-api Updated: 2026-02-18 Summary: The Core API specification is linked to here. Integrate > Integrate with Radix > Exchange Integration Guide > Integration: LTS Core API — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/integrate-with-radix/exchange-integration-guide/exchange-integration-lts-core-api.md) The Core API specification is linked to here (https://radix-babylon-core-api.redoc.ly/) . The Core API is a JSON-RPC API on a Babylon node, conventionally exposed on port 3333. 
Each request to the Core API (except the /core/status/network-configuration endpoint) requires the network field, which must contain the logical name of the node’s network, to ensure you are correctly configured. For debugging, you can query the network name from the /core/status/network-configuration endpoint. The Typescript Core API SDK handles this for you. In terms of API clients: - We provide a Typescript Core API SDK on npm as @radixdlt/babylon-core-api-sdk (https://www.npmjs.com/package/@radixdlt/babylon-core-api-sdk) , with some examples in the README (https://github.com/radixdlt/babylon-node/tree/main/sdk/typescript) . - We don’t have current plans to provide Core API clients in other languages before mainnet launch. You may be able to generate your own client from the Core API Open API schema, but many Open API generators are quite buggy, so it may be easier to write the models yourself. Documentation on each endpoint is available on Redocly, through the Core API specification link above (https://radix-babylon-core-api.redoc.ly/) . We also provide worked examples of integrations below. If there is functionality which you need and which is missing from the LTS Core API, please let us know and we may be able to add this. 
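The network-field check above can be sketched in TypeScript. The response field name (network) is an assumption to confirm against the Core API spec, and the fetch-like parameter just keeps the example self-contained:

```typescript
// Sketch: verify the node you are talking to is on the expected logical
// network before sending other Core API requests.

// Pure check, so it can be unit-tested without a node.
function isExpectedNetwork(config: { network?: string }, expected: string): boolean {
  return config.network === expected;
}

// Minimal fetch-like signature so the sketch has no global dependencies.
type NetworkFetchLike = (url: string, init: object) => Promise<{ json(): Promise<any> }>;

// coreApiBase is e.g. "http://localhost:3333/core" (conventional port 3333).
async function assertNetwork(coreApiBase: string, expected: string, fetchFn: NetworkFetchLike): Promise<void> {
  const res = await fetchFn(`${coreApiBase}/status/network-configuration`, {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify({}),
  });
  const config = await res.json();
  if (!isExpectedNetwork(config, expected)) {
    throw new Error(`Node reports network "${config.network}", expected "${expected}"`);
  }
}
```

Running a check like this at start-up catches the common misconfiguration of pointing a mainnet integration at a testnet node (or vice versa).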
## Development set-up for developing an integration URL: https://radix.wiki/developers/legacy-docs/integrate/integrate-with-radix/exchange-integration-guide/exchange-integration-development-set-up-for-developing-an-integration Updated: 2026-02-18 Summary: The recommended development set-up is to either: Integrate > Integrate with Radix > Exchange Integration Guide > Development set-up for developing an integration — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/integrate-with-radix/exchange-integration-guide/exchange-integration-development-set-up-for-developing-an-integration.md) The recommended development set-up is to either: - Run a docker node locally via docker-compose, synced to a test network, by cloning the babylon-node (https://github.com/radixdlt/babylon-node) repository and following the instructions in the testnet-node folder (https://github.com/radixdlt/babylon-node/tree/main/testnet-node) . - Run a shared testnet node in your infrastructure which you can connect to. See the Test Networks section (or follow the link in the previous bullet point) for more details on how to configure this node. You will then need to develop integrations against: - The LTS Core API - see the Integration: LTS Core API section. - The Toolkit for Exchanges - see the Integration: LTS Toolkit for Exchanges section. - Any custom data sources/infrastructure that you wish to push data into. To test sending transfers as a user would: - You may wish to install the wallet and connector extension (https://docs-babylon.radixdlt.com/main/getting-started-developers/wallet/wallet-and-connecter-installation.html) . - When you have created an account, you can retrieve test XRD through the wallet. 
## Infrastructure set-up URL: https://radix.wiki/developers/legacy-docs/integrate/integrate-with-radix/exchange-integration-guide/exchange-integration-infrastructure-set-up Updated: 2026-02-18 Summary: Assuming that the Core API satisfies your integration requirements, then we’d recommend running: Integrate > Integrate with Radix > Exchange Integration Guide > Infrastructure set-up — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/integrate-with-radix/exchange-integration-guide/exchange-integration-infrastructure-set-up.md) Assuming that the Core API satisfies your integration requirements, then we’d recommend running: - Two mainnet nodes: A primary node, and a hot back-up, which can be swapped to if the primary node has issues. - A testnet node which can be used for testing new integrations or upgrades. This can be taken down when not in use. To configure a node for production, you have a few options: - You can configure an Ubuntu VM to run a node using the NodeCLI - details found on our documentation site here (https://github.com/gguuttss/radix-docs/blob/master/run/node/README.md) . - It is also possible to run the docker image in a container orchestrator - by adapting the docker compose from the testnet-node configuration (https://github.com/radixdlt/babylon-node/blob/main/testnet-node/docker-compose.yml) for your set-up. - You can compile your own node from source (https://github.com/gguuttss/radix-docs/blob/master/run/node/README.md) ; these release notes (https://github.com/radixdlt/babylon-node/releases/) explain some differences and tweaks needed to get it working for Babylon. ## Exchange Integration Guide URL: https://radix.wiki/developers/legacy-docs/integrate/integrate-with-radix/exchange-integration-guide/exchange-integration-guide Updated: 2026-02-18 Summary: The following guide will take you through recommendations for how to integrate with the Radix ledger. 
Integrate > Integrate with Radix > Exchange Integration Guide — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/integrate-with-radix/exchange-integration-guide/README.md) The following guide will take you through recommendations for how to integrate with the Radix ledger. This includes use of simplified tools (the LTS Core API and the Toolkit for exchanges) which provide straightforward interfaces if you only need to send and track fungible resources between accounts. We also suggest reading through the Technical Concepts (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/README.md) documentation for information on concepts described in this guide. Overview Your production integration will typically involve: - A production node in full node configuration, connected to mainnet. - You should also run a back-up node for hot-swapping if the main node has issues. - Deployed custom code which connects to your infrastructure, and Radix infrastructure. It would typically make use of: - An integration with a Radix Engine Toolkit library for constructing/signing/finalizing transactions. - We have created a high-level “Toolkit for exchanges” extension to the toolkit wrappers, with an API which will be compatible for mainnet launch and beyond. - This “Toolkit for exchanges” extension is designed to make it easy to construct fungible token transfers between single-signer accounts. - An integration with the Core API of your node. - We have created an /lts/* sub-api, designed to be simple for exchanges to use, and compatible for mainnet launch and beyond. ## Introduction to Radix at Babylon URL: https://radix.wiki/developers/legacy-docs/integrate/integrate-with-radix/introduction-to-radix-at-babylon Updated: 2026-02-18 Summary: The Radix Public Network is a Layer-1 Decentralized-Finance (DeFi) platform. 
Radix has been designed and built from the ground up over many years to be a full-s Integrate > Integrate with Radix > Introduction to Radix at Babylon — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/integrate-with-radix/introduction-to-radix-at-babylon.md) The Radix Public Network is a Layer-1 Decentralized-Finance (DeFi) platform. Radix has been designed and built from the ground up over many years to be a full stack for the mainstream adoption of DeFi. This is most easily understood by watching our RadFi keynote (https://www.radixdlt.com/radfi) , which explains the problems we see with DeFi right now, and how the Radix platform can enable: - A mainstream-ready user experience - A bespoke programming language and execution engine which allows for simple building of secure decentralized applications - A network which can support unlimited scalability at our Xi’an release There are three key milestones in the Radix journey: - Olympia - Released in July 2021; now ended, with all data migrated to Babylon. - Simple UTXO-based state/transaction model supporting fungible resources, accounts, validators and staking. - Babylon - Released September 27 2023. Currently live. - A complete rework of the network’s state and transaction model, using a more intuitive account-based state model and intent-based transaction model, with support for offline transaction construction. - This release brings full smart-contract style functionality, built to run in the Radix Engine. The Radix Engine is an execution environment built from scratch for DeFi, which is not EVM compatible - a decision which enables it to be tailored for DeFi security, user experience, and network throughput. - Xi’an - Planned as the next major milestone after Babylon. - Xi’an will bring infinitely scalable, multi-sharded consensus to the Radix network with the Cerberus consensus protocol. This guide will help you to prepare for an integration with the Radix Babylon Network. 
Given that Olympia is no longer live, most integrators do not need to worry about it, and can start at Babylon Technical Concepts (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/README.md) . Some integrators (e.g. auditors, tax integrations, etc.) may also care about ingesting data from Olympia; if this applies to you, please see the Integrations with the historic Olympia ledger (integrations-with-the-historic-olympia-ledger) section. ## Integrate with Radix URL: https://radix.wiki/developers/legacy-docs/integrate/integrate-with-radix/integrate-with-radix Updated: 2026-02-18 Summary: This section is a guide for those looking to build programmatic integrations with the Radix Network. It is intended for exchanges and integrators to understand Integrate > Integrate with Radix — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/integrate-with-radix/README.md) This section is a guide for those looking to build programmatic integrations with the Radix Network. It is intended for exchanges and integrators to understand how Babylon works in order to prepare an integration with Radix’s mainnet. A broad overview of the Radix technical landscape at Babylon can be found in Technical Concepts (https://github.com/gguuttss/radix-docs/blob/master/reference/babylon-technical-concepts/README.md) (formerly a part of this guide). This integration guidance is focused towards constructing and tracking fungible transfers to and from accounts. 
Index - Introduction to Radix at Babylon (https://github.com/gguuttss/radix-docs/blob/master/integrate/integrate-with-radix/introduction-to-radix-at-babylon.md) - Exchange Integration Guide (https://github.com/gguuttss/radix-docs/blob/master/integrate/integrate-with-radix/exchange-integration-guide/README.md) - Infrastructure set-up (infrastructure-set-up) - Development set-up for developing an integration (development-set-up-for-developing-an-integration) - Integration: LTS Core API (integration-lts-core-api) - Integration: LTS Toolkit (integration-lts-toolkit) - Detecting deposits from different users (detecting-deposits-from-different-users) - Worked example 1: Transfer transaction submission flow (worked-example-1-transfer-transaction-submission-flow) - Worked example 2: Tracking deposits to any account (worked-example-2-tracking-deposits-to-any-account) - Worked example 3: Tracking deposits to specific account (worked-example-3-tracking-deposits-to-specific-account) - Worked example 4: Node status and monitoring (worked-example-4-node-status-and-monitoring) - Updating from Olympia (https://github.com/gguuttss/radix-docs/blob/master/integrate/integrate-with-radix/updating-from-olympia/README.md) - Changes in Babylon from Olympia (https://github.com/gguuttss/radix-docs/blob/master/integrate/integrate-with-radix/updating-from-olympia/changes-in-babylon-from-olympia.md) - Sourcing data from Olympia (https://github.com/gguuttss/radix-docs/blob/master/integrate/integrate-with-radix/updating-from-olympia/sourcing-data-from-olympia.md) - Olympia to Babylon migration, address mapping and reconciliation (https://github.com/gguuttss/radix-docs/blob/master/integrate/integrate-with-radix/updating-from-olympia/olympia-to-babylon-migration-address-mapping-and-reconciliation.md) - FAQs (https://github.com/gguuttss/radix-docs/blob/master/integrate/integrate-with-radix/integrate-with-radix-faqs.md) ## Integrate URL: https://radix.wiki/developers/legacy-docs/integrate/integrate 
Updated: 2026-02-18 Summary: Legacy documentation: Integrate Integrate — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/integrate/README.md) ## Ecosystem Tools List URL: https://radix.wiki/developers/legacy-docs/use/ecosystem-tools-and-libraries Updated: 2026-02-18 Summary: Here you'll find a list of external dapps, libraries and other tools from across the Radix ecosystem that you may find useful. Use > Ecosystem Tools List — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/use/ecosystem-tools-and-libraries.md) Here you'll find a list of external dapps, libraries and other tools from across the Radix ecosystem that you may find useful. These are unvetted. Inclusion in this list does not imply endorsement or make any claims of correctness or stability. Please look at the individual service for more details, and judge for yourself if it fits your needs. If you are responsible for providing a tool or service and would like it to be included here, please get in touch on Discord or at hello@radixdlt.com (mailto:hello@radixdlt.com) . Analytics, Dashboards and Explorers - Official Dashboard (https://dashboard.radixdlt.com/) ( Stokenet (/contents/tech/releases/stokenet) ) - The official Radix dashboard and explorer. - RadXplorer (https://www.radxplorer.com/) : A community explorer, covering transactions, accounts, assets, validators. - RadixCharts (https://radixcharts.com/) : Analytics platform providing statistics, rankings and charts for the Radix DLT ecosystem. - RadixScan (https://www.radixscan.io/) : A community explorer, covering transactions, accounts, validators, assets and .xrd domains. - StakeSafe Validator Dashboard (https://validators.stakesafe.net/) ( Stokenet (https://stokenet.stakesafe.net/) ): A detailed overview of the validator set. - Radix List (https://www.radixlist.com/) : A listing of ecosystem projects on Radix. 
- Node Manager (https://nodemanager.addix.meme/) by ADDIX: Track XRD flow in/out of nodes at a glance, get alerts for sudden fee changes, and manage stakes with ease; node owners can now pull shares directly from the UI. APIs and Oracles - Radix API (https://radixapi.net/) : Provides additional endpoints and functionality to the official Gateway API to make it easier to obtain specific data. - Astrolescent API (https://docs.astrolescent.com/astrolescent-docs/decentralized-exchange-aggregator/integrations/api) : Find tokens, prices and swap. - Supra Oracles (https://supraoracles.com/) (Coming soon): Offers "push" and "pull" oracle services to deliver precise and reliable price data for more than 100 of the most popular crypto asset pairs. - Religant Oracles (https://religantoracles.com/) : The Religant Oracle protocol enables a decentralized feed of price data. Custody - Copper (https://copper.co/products/digital-asset-custody) : Copper provides custody of an impressive and expanding range of digital assets, all insured and held with the highest security standards. - Bitpanda (https://www.bitpanda.com/en/bitpanda-spotlight) : Bitpanda has over 4 million users across 40 countries, opening Radix up to more avenues for adoption within the Web3 community. Bitpanda’s BEST (Bitpanda Ecosystem Token) holders have additional opportunities thanks to this integration. BEST holders are eligible to subscribe to $XRD and receive giveaway tokens. - Ledger (https://learn.radixdlt.com/article/using-ledger-nano-s-s-plus-or-x-with-the-radix-wallet) : The Radix Wallet can connect to a Ledger Nano S, S Plus, or X hardware wallet and create a special hardware wallet account. This account is protected by the Ledger Nano – even if the Wallet or the PC it is running on is untrusted or compromised in some way. For this account, all transactions must be approved and signed on the Ledger device while connected to a PC that is linked to the Radix Wallet via the Radix Connector browser extension. 
- Arculus card (https://www.getarculus.com/) (soon): This integration will bring two exciting benefits to the Radix community: a highly convenient way of adding hardware-based security to the Radix Wallet’s coming multi-factor authentication (MFA) account control and an expansion of the reach of the XRD token with a new multi-network mobile wallet app option from Arculus. RPC - Grove (https://www.grove.city) : Grove is a pioneer of decentralized RPC infrastructure that has integrated the POKT Network with Radix, adding Radix to their extensive list of supported protocols. With this integration, developers can focus more on building their applications without the hassle of maintaining their own nodes or RPC infrastructure. - NowNodes (https://nownodes.io/radix) : NOWNodes is a Blockchain-as-a-Service solution that provides access to Full Nodes, Explorers and Blockbooks via API. 65+ blockchain networks supported. Choose the node that you need, use available methods and connect. On-off ramp - Alchemy Pay (https://alchemypay.org) : Alchemy Pay supports payments in 173 countries and accepts over 50 Fiat currencies via 300+ payment channels. Users can effortlessly convert fiat currencies to XRD and vice versa. - Switchere (https://switchere.com/) : Switchere supports any credit, prepaid, or debit card issued by VISA, Mastercard, or Maestro. Instapass and Instabridge Instabridge (https://www.instabridge.io/) is the first option eXRD token holders have for converting eXRD to XRD for use on the Radix Public Network. If you would like to use Instabridge to convert your eXRD to XRD, you will first need to ensure you have a verified Instapass (http://instapass.io/) account. A guide for connecting your Ethereum (eXRD) or Radix (XRD) wallets to your Instapass account can be found here (https://www.instapass.io/) . 
Maya - Maya Protocol (https://www.mayaprotocol.com/) : Maya Protocol is a platform specializing in decentralized liquidity and enabling seamless swaps across various blockchains. Once the integration is complete, users of Maya and the Radix Network will be able to trustlessly exchange assets cross-chain without compromising on security. - THORSwap (https://www.thorswap.finance/) : Thorswap enables anyone to interact with liquidity across 15 blockchains, 3 cross-chain protocols and 11 Decentralized Exchanges/ Aggregators. Perform cross-chain swaps between 5,500+ assets (including Native Bitcoin) in a single permissionless, non-custodial transaction. Scrypto Libraries - scrypto-avltree (https://github.com/ociswap/scrypto-avltree) by Ociswap: A sophisticated new scalable data structure, offered as an alternative to the BTreeMap, that can be used to house an order book exchange or concentrated liquidity positions. - scrypto-math (https://github.com/ociswap/scrypto-math) by Ociswap: Brings advanced math operations such as exponential, logarithm, and power functions to any Scrypto blueprint that loads the scrypto-math module as a dependency. - radix-event-stream (https://github.com/ociswap/radix-event-stream/tree/main) by Ociswap: An extensible Rust library to help you identify and process events from the Radix ledger. - Scrypto Toolkit (https://github.com/BeakerTools/scrypto-toolkit) by Avincigu : This library aims to provide open source tools for Scrypto developers, including a Test Engine and a Maths library. Tools by the Community - Instruct Manifest Builder ( Mainnet (https://instruct.radixbillboard.com/) | Stokenet (https://instruct-stokenet.radixbillboard.com/) | Code (https://github.com/radixbillboard/instruct-mainnet-site) ) by Radix Billboard: It allows users to easily and flexibly create custom instructions to be submitted to the network. Not only does it support relevant manifest instructions, it also supports arbitrary Scrypto blueprint schemas. 
- Manifest Checker ( Mainnet (https://ilikeitstable.com/manifest-checker) | Stokenet (https://ilikeitstable.com/manifest-checker-stokenet) ) by Stabilis: It validates your manifest and runs a preview, allowing easy debugging of issues with user-built manifests. - Xidar Validator Manager ( Mainnet (https://my.xidar.io/tools/validator-manager) ) by Xidar: It allows validator runners to take standard maintenance actions. - Manifest Builder ( Mainnet (https://tmbuilder.stakingcoins.eu/) | Code (https://github.com/nemster/tmbuilder) ) by Marco Michelino | Staking Coins: It features a variety of functionalities, including adding and withdrawing liquidity from Ociswap, depositing and retrieving LSUs from CaviarNine, redeeming WEFT, and facilitating swaps on Ociswap, DefiPlaza, AlphaDex, and RadixPlanet. It is important to note that this tool is intended for educational purposes and should be used with caution on Mainnet. - Desktop Tool ( Code (https://github.com/atlantis-l/Radix-Desktop-Tool) ) by Hold: The Radix Desktop Tool includes multiple functionalities, such as multiple-to-multiple token transfers, single-to-multiple token transfers, multiple-to-single token transfers, manifest execution, package deployment, token creation, account generation, asset checking, history reviewing, entity verification, format conversion, address QR code generation, and XRD faucet in Stokenet. - ERC-404 using Scrypto ( Code (https://github.com/aus87/ice_rrc404v1) ) by Austin.XRD: ERC-404 is a token standard that combines the features of fungible (ERC-20) and non-fungible (ERC-721) tokens into a hybrid form that enables both unique and divisible asset representations on the blockchain. Inspired by this innovative yet complex token standard, Austin.XRD set out to simplify this using Radix Native Assets; the new token standard on Radix is called RRC-404. Check out the GitHub repository. 
## Radix Wallet Overview URL: https://radix.wiki/developers/legacy-docs/use/radix-wallet-overview Updated: 2026-02-18 Summary: Please visit for a step-by-step guide on how to get up and running with the Radix Wallet. Use > Radix Wallet Overview — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/use/radix-wallet-overview.md) Please visit https://wallet.radixdlt.com (https://wallet.radixdlt.com) for a step-by-step guide on how to get up and running with the Radix Wallet. Or, watch this video walkthrough of the process: https://www.youtube.com/embed/N0utev66iAg (https://www.youtube.com/embed/N0utev66iAg) ## Radix Dashboard (Explorer+) URL: https://radix.wiki/developers/legacy-docs/use/radix-dashboard Updated: 2026-02-18 Summary: After you have linked Radix Wallet and Connector, the easiest place to see how they work with dApps is to visit the Radix Dashboard or Radix Console( Use > Radix Dashboard (Explorer+) — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/use/radix-dashboard.md) Using the Dashboard dApp with the Radix Wallet After you have linked Radix Wallet and Connector, the easiest place to see how they work with dApps is to visit the Radix Dashboard (https://dashboard.radixdlt.com) or Radix Console (https://console.radixdlt.com) . For those familiar with the preceding Olympia network, the Radix Dashboard replaces the Radix Explorer at Babylon. Like the Explorer, the Dashboard lets you search for transactions, accounts, and more. It also expands on these capabilities as a real Web3 dApp where you can connect your Radix Wallet directly, to let users perform common network activities such as staking. The Console supplements this by offering utility pages for developer use, including sending raw transaction manifests, and package deployment. In the upper right corner is the √ Connect button. 
This is a web component that any developer can import into their dApp website project; it makes it easy to integrate with the Radix Wallet, and gives users an immediate indication that this is a Web3 site where they can use the Radix Wallet. The Radix dApp Toolkit (https://github.com/gguuttss/radix-docs/blob/master/build/build-dapps/dapp-application-stack/dapp-sdks/dapp-toolkit.md) provides a seamless way for developers to implement their own √ Connect button. Click that button, select “Connect Now” and the Connector extension will automatically attempt to find your Radix Wallet. Open the Radix Wallet on your phone, and you should receive a connection request from the Dashboard dApp. For the initial version of the Dashboard, all it needs from the wallet is a list of whichever accounts you want to use with it. (For a more complex website, it might have included a request for a Persona login or other things.) Choose some, or all, of your accounts, click Continue, and your wallet will provide those addresses back to the Dashboard – and you’ll see that the √ Connect button now shows you as connected! Now you can go to other Dashboard/Console pages to stake to validators (https://dashboard.radixdlt.com/network-staking) , deploy a package (https://console.radixdlt.com/deploy-package) , or send a raw transaction manifest (https://console.radixdlt.com/transaction-manifest) ; the Dashboard can use those addresses from your wallet to populate form inputs and construct transactions to send to your wallet. During development, you will likely want to use Stokenet, the Radix test network, where you can freely obtain XRD directly in your wallet for testing purposes. Don’t forget to switch your wallet’s Gateway setting to Stokenet, and then visit the Stokenet Dashboard (https://stokenet-dashboard.radixdlt.com) or Stokenet Console (https://stokenet-console.radixdlt.com) . 
## Use URL: https://radix.wiki/developers/legacy-docs/use/use Updated: 2026-02-18 Summary: Legacy documentation: Use Use — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/use/README.md) ## Getting Help URL: https://radix.wiki/developers/legacy-docs/essentials/getting-help Updated: 2026-02-18 Summary: If you are looking for more general answers about Radix, the Radix Knowledge Base( Essentials > Getting Help — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/essentials/getting-help.md) Learn If you are looking for more general answers about Radix, the Radix Knowledge Base (https://learn.radixdlt.com/?_gl=1*shuqay*_ga*MTIzMTQzODAzMS4xNjgxNDA1NDI2*_ga_MZBXX3HP5Q*MTY5NTY2MDI0MS4yMzUuMS4xNjk1NjY2MzEyLjMwLjAuMA..) provides an ever-expanding resource of frequently asked questions and guides. Ask and Discuss Radix is an open source, community supported project. Whether you are a user, developer, or node-runner, the best place to go to get live answers to your questions is the Radix Discord server (https://go.radixdlt.com/Discord) . There you’ll find both members of the Radix community and Radix developer team who can assist. Get Ready to Build If you are a developer interested in building on Radix, head over to the Radix Developers Hub (https://developers.radixdlt.com/) where you can sign up for our developer mailing list and much more. Connect The Radix ecosystem is also supported by the Radix Foundation - a non-profit organization of Radix’s original creators and developers. If you have a specific inquiry about integrating the Radix token, building on Radix, or partnering with Radix, please reach out at hello@radixdlt.com (mailto:hello@radixdlt.com) . 
## Authorization Approach URL: https://radix.wiki/developers/legacy-docs/essentials/authorization-approach Updated: 2026-02-18 Summary: The traditional blockchain security pattern of deciding access control based on who or what called you has led to enormous sums of money lost to clever privile Essentials > Authorization Approach — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/essentials/authorization-approach.md) The traditional blockchain security pattern of deciding access control based on who or what called you has led to enormous sums of money lost to clever privilege escalation attacks on other crypto networks, so Radix was designed with a different model in mind. Radix adopts a role-based access control pattern in which developers set the rules about which roles are able to perform certain actions (e.g., accessing an administrator-only method on a component, or minting more supply of a resource), and the system enforces that the proper role was met before permitting the action to occur. Badges are the mechanism by which actors demonstrate that they are able to meet a role. Badges can be proofs of resources (like showing that you have access to a membership token), or they can be virtual things created automatically by the system (like a proof that a certain signature was present in the transaction). Developers have fine-grained control over the roles they create, allowing them to choose whether they are updatable or immutable on an individual basis, and these settings are all visible to on- and off-ledger tools. This means consumers can understand the security pattern of an unfamiliar component or resource without having to delve into reading any code. 
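The role-and-badge pattern described above can be sketched in plain Python. This is purely an illustrative model, not Scrypto: the names `Badge` and `AccessRules` and the rule layout are invented for the example, while real components declare roles on-ledger and the system enforces them.

```python
# Illustrative sketch of Radix-style role-based access control.
# Badge and AccessRules are invented names for this example only;
# on Radix, roles are declared in Scrypto and enforced by the system.

class Badge:
    """A proof an actor presents, e.g. ownership of a resource."""
    def __init__(self, resource_address: str):
        self.resource_address = resource_address

class AccessRules:
    """Maps protected actions to the badge that satisfies the role."""
    def __init__(self, rules: dict[str, str]):
        self.rules = rules  # action -> required badge resource address

    def check(self, action: str, proofs: list[Badge]) -> bool:
        required = self.rules[action]
        return any(p.resource_address == required for p in proofs)

# A component with an admin-only method and a separate minting role.
rules = AccessRules({
    "set_fee": "resource_admin_badge",
    "mint":    "resource_minter_badge",
})

admin_proof = Badge("resource_admin_badge")
assert rules.check("set_fee", [admin_proof])   # admin badge meets the role
assert not rules.check("mint", [admin_proof])  # wrong badge: action denied
```

The point the section makes falls out of the sketch: the rules live with the protected actions and are checked against presented proofs, not against "who called me," so they can be inspected without reading the component's business logic.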
Unique in the crypto space, Radix's authorization model finally allows for a proper separation of concerns between your business logic and your access rules, giving your applications the flexibility to start with a simple control scheme and grow into something more complex later on—without having to hand-roll a bespoke solution, and without having to rely on the demonstrably dangerous "what's the source of this request" pattern which plagues the industry. ## Transactions on Radix URL: https://radix.wiki/developers/legacy-docs/essentials/transactions-on-radix Updated: 2026-02-18 Summary: On other smart contract platforms, transactions are typically a one-trick pony: sending a method call to a smart contract. The result of that call is some combi Essentials > Transactions on Radix — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/essentials/transactions-on-radix.md) On other smart contract platforms, transactions are typically a one-trick pony: sending a method call to a smart contract. The result of that call is some combination of updating the smart contract’s internal state and often invisibly calling methods on other contracts to coordinate some state changes. The sequence of steps is locked-in; all your transaction can do is pick an entry point and the rest is out of your hands. On Radix, transactions describe a series of calls to different on-ledger components, passing data and resources between them. This sequence of calls is called a transaction manifest, and it describes exactly the steps that must occur in order for the transaction to complete successfully. This composition of various on-ledger items is atomic—either everything happens together, or nothing happens. There's no chance of a transaction executing only a subset of the steps and leaving you with an unexpected half-way result. 
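As a rough illustration of what such a manifest looks like, the fragment below withdraws tokens from an account, passes them to a component's swap method, and deposits whatever comes back. All addresses are placeholders, and the exact instruction names and account methods vary by network version, so treat this as a sketch of the shape rather than a canonical manifest:

```
CALL_METHOD
    Address("account_rdx1...")
    "withdraw"
    Address("resource_rdx1...")
    Decimal("10");
TAKE_ALL_FROM_WORKTOP
    Address("resource_rdx1...")
    Bucket("tokens");
CALL_METHOD
    Address("component_rdx1...")
    "swap"
    Bucket("tokens");
CALL_METHOD
    Address("account_rdx1...")
    "deposit_batch"
    Expression("ENTIRE_WORKTOP");
```

Each instruction is one step of the composition: if any step fails, the whole transaction aborts and no state changes occur.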
Transaction manifests support user-defined guarantees, added by client tools (like a wallet), which provide network assurance that outcomes will be as expected. There is no need for components to build in handling for things like slippage and sandwich trading...on Radix that's all handled at a higher layer, and the user is in control. Builders of web sites which integrate the use of multiple different components do not need to deploy any code in order to achieve custom sequences of calls; combinations can be brought together, on-the-fly, directly within the manifest, and customers can always count on their wallet software to lay out what is happening within the transaction, and guarantee the outcomes they care about. ## Reusable Code URL: https://radix.wiki/developers/legacy-docs/essentials/reusable-code Updated: 2026-02-18 Summary: Radix splits the concept of "smart contract" into two parts: blueprints and components. Essentials > Reusable Code — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/essentials/reusable-code.md) Radix splits the concept of "smart contract" into two parts: blueprints and components. Blueprints are similar to classes in object-oriented programming. They describe what state a component will manage, and implement the various methods which will be used to manage that state. Components are instantiated from blueprints, just like objects are instantiations of a class. Multiple components can be instantiated from a single blueprint, and each will have its own state but follow the shared logic implemented in the blueprint. For example, you might create a RedeemToken blueprint which is expected to give out some token in exchange for a different input token. You could then instantiate a component from that blueprint which expected to receive ZOMBO and hand out CERB, and instantiate another component which expected to receive GUMBLE and hand out COIN. 
Your instantiation parameters would define the expected resources, and likely supply their pool of resources to hand out as well. Each instantiated component would have its own global address. Because the transaction model allows for ad-hoc combinations of calls to different components, passing data and resources between them, developers are encouraged to write modular blueprints that "do one useful thing well," similar to the renowned Unix development philosophy. To incentivize the creation of reusable blueprints and components, Radix implements a native royalty system which automatically collects any developer-specified "use fees" any time the blueprint or component is used in a transaction. ## Asset-Oriented URL: https://radix.wiki/developers/legacy-docs/essentials/asset-oriented Updated: 2026-02-18 Summary: The entire Radix stack—from the execution environment, to the programming language, to the transaction model, to the wallet—is built to support the safe creatio Essentials > Asset-Oriented — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/essentials/asset-oriented.md) The entire Radix stack—from the execution environment, to the programming language, to the transaction model, to the wallet—is built to support the safe creation, storage, and interaction with native assets, which we call resources. Resources are a first-class primitive in Radix, with guaranteed behaviors that are provided by the system, and powerful customizations available to developers. Resources intuitively follow the same behavior as real-world physical objects, and are transacted with in a physical manner. Resources can be placed in buckets and passed around or returned from a smart contract, just the same as you would with an integer or a string. 
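A loose Python analogy ties the last two ideas together: the blueprint/component split from the Reusable Code section and the bucket model described here. This is illustrative only; real blueprints are written in Scrypto, and the `Bucket` class and method signatures below are invented, with the token names borrowed from the RedeemToken example above.

```python
# Illustrative analogy: a blueprint is like a class, a component like an
# instance, and resources move between them in buckets. Not real Scrypto.

class Bucket:
    """Holds an amount of one resource; resources move, they aren't copied."""
    def __init__(self, resource: str, amount: int):
        self.resource, self.amount = resource, amount

class RedeemToken:  # the "blueprint": shared logic for all instances
    def __init__(self, accepts: str, pays_out: str, pool: Bucket):
        self.accepts, self.pays_out, self.pool = accepts, pays_out, pool

    def redeem(self, payment: Bucket) -> Bucket:
        assert payment.resource == self.accepts, "wrong input token"
        paid = payment.amount
        self.pool.amount -= paid            # resources leave the pool...
        return Bucket(self.pays_out, paid)  # ...and return in a new bucket

# Two "components" instantiated from one blueprint, each with its own state:
zombo_for_cerb = RedeemToken("ZOMBO", "CERB", Bucket("CERB", 1_000))
gumble_for_coin = RedeemToken("GUMBLE", "COIN", Bucket("COIN", 500))

out = zombo_for_cerb.redeem(Bucket("ZOMBO", 10))
assert (out.resource, out.amount) == ("CERB", 10)
assert zombo_for_cerb.pool.amount == 990   # this component's state changed
assert gumble_for_coin.pool.amount == 500  # the other's did not
```

Note how the bucket is handed around like a physical object: the tokens either sit in the pool or in a returned bucket, never in both.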
As we'll see in a later section, the transaction model is built around the orchestration of moving resources around between on-ledger components, in a way that the end-user can understand the consequences of signing a transaction, with the ability to easily add user-defined guarantees to outcomes. Because resources are defined by the system and developers must specify the rules under which things like minting can occur, off-ledger clients like wallets and explorers can look at a given resource on-ledger and know things about how it behaves—without having to read any smart contract code—and notify users appropriately. If you have developed on Ethereum-alikes where tokens are simply smart contracts which meet an interface standard, you are probably accustomed to baking logic directly into the code for your tokens. Programming doesn't work like that on Radix...business logic is strictly separate from the assets themselves, and you'll need to retrain your brain a bit. You'll be glad you did. ## Essentials URL: https://radix.wiki/developers/legacy-docs/essentials/essentials Updated: 2026-02-18 Summary: Regardless of what you're looking to do on Radix, we highly recommend taking just 5 minutes to read through the essential topics in this section, which briefly Essentials — View on GitHub (https://github.com/gguuttss/radix-docs/blob/master/essentials/README.md) Regardless of what you're looking to do on Radix, we highly recommend taking just 5 minutes to read through the essential topics in this section, which briefly describe the core of how it works differently from other networks. Asset-Oriented Radix's comprehensive stack is meticulously designed to enable the secure creation, storage, and seamless interaction of native assets—termed resources—through intuitive transaction models and a clear separation of asset behavior from business logic. 
More here (https://github.com/gguuttss/radix-docs/blob/master/essentials/asset-oriented.md) Reusable Code Radix redefines smart contracts by dividing them into modular blueprints and components, enabling reusable structures with distinct state management and logic while incentivizing developers through a built-in royalty system. More here (https://github.com/gguuttss/radix-docs/blob/master/essentials/reusable-code.md) Transactions on Radix Radix enhances transaction functionality by utilizing atomic transaction manifests that orchestrate multiple on-ledger component calls, ensuring all-or-nothing execution and empowering users with customizable guarantees. More here (https://github.com/gguuttss/radix-docs/blob/master/essentials/transactions-on-radix.md) Authorization Approach Radix implements a secure role-based access control system using badge-based permissions and transparent, fine-grained rules, effectively separating business logic from access management to prevent privilege escalation attacks. More here (https://github.com/gguuttss/radix-docs/blob/master/essentials/authorization-approach.md) ## XRD Domains URL: https://radix.wiki/ecosystem/xrd-domains Updated: 2026-02-18 Summary: XRD Domains was the primary developer of the Radix Name Service (RNS) and served as a platform for domain registration and management. XRD Domains was a now-deprecated domain registration and management platform for .xrd names on the Radix network, and a major builder of tooling around the Radix Name Service (RNS). It previously served as the primary user-facing gateway for discovering, acquiring, and administering .xrd domains. In practice, XRD Domains acted as the "dashboard" experience for .xrd owners - abstracting the underlying on-ledger complexity and providing a more approachable interface for handling records / configuring how a domain resolved or behaved across Radix applications. 
As a result of the Radix Name Service (RNS) community vote, the project adopted a wind-down path for XRD Domains. The vote endorsed moving toward an immutable, fork-based method of operation at the community level, so the namespace could continue as a neutral, long-lived piece of infrastructure without depending on a single operating team or funding. In practice, that community-approved transition meant the centralized platform work that XRD Domains previously provided was no longer the intended operating model, and a clear-cut path forward was necessary. With a new, alternative and open-sourced protocol designed to be durable and registrar-agnostic, XRD Domains was deprecated and subsequently closed as an active service. ## Aure1is URL: https://radix.wiki/community/aure1is Updated: 2026-02-18 Welcome to Aure1is This is your personal community page. Edit it to share your thoughts, projects, and contributions with the RADIX community. About Me Tell the community about yourself... ## WMD URL: https://radix.wiki/community/wmd Updated: 2026-02-17 Welcome to WMD This is your personal community page. Edit it to share your thoughts, projects, and contributions with the RADIX community. About Me Tell the community about yourself... ## Bayesian URL: https://radix.wiki/community/bayesian Updated: 2026-02-17 ## RFC: Mini Season 2 of Radix Rewards during Foundation Handover URL: https://radix.wiki/ideas/rfc-mini-season-2-of-radix-rewards-during-foundation-handover Updated: 2026-02-17 There are calls to start a smaller season 2 of Radix Rewards during the Foundation handover to community. It would be nice to have a small incentive live to take advantage of any new eyes on our ledger from Timan’s test and related publicity. I would like to hear from others if you think this is a good idea, how much rewards should be involved, and how long it should run? The numbers I have seen floating around are 6 weeks duration, 5-10 million XRD reward. 
Throwing this RFC together quickly because this seems like something that would need to be voted on quickly, perhaps even alongside the next vote on which token to use for DAO voting participation. ## Quack Space URL: https://radix.wiki/ecosystem/quackspace Updated: 2026-02-14 Your Wallet is Your Voice: 5 Ways Quack.Space (http://Quack.Space) is Redefining Social Financial Identity Introduction: The End of the Silent Portfolio For too long, a structural divide has separated our digital discourse from our financial reality. We live in a fragmented ecosystem where social influence is built on one set of rails while capital is managed on another, creating a persistent disconnect between what we say and what we actually stake. This era of the "silent portfolio" is coming to an end. Quack.Space (http://Quack.Space) represents a profound fusion of the ATProtocol and RadixDLT, eroding the wall between liquidity and legacy social graphs. This integration formalizes a new paradigm: Social Financial Identity (SFI). In this framework, your social graph and reputation are no longer isolated metrics; they are provably linked to your on-chain assets and financial history. By making a user’s financial position an active, transparent component of their social presence, Quack.Space (http://Quack.Space) transforms the wallet from a passive vault into a resonant voice. 1. Your Influence is Now Provably Weighted The most significant shift in the Quack.Space (http://Quack.Space) ecosystem is the transition from "one user, one vote" to Token Weighted Views and Votes. Traditional social platforms are plagued by bot-driven sybil attacks and shallow engagement metrics that fail to distinguish between noise and conviction. Quack.Space (http://Quack.Space) corrects this by allowing engagement to be algorithmically amplified based on specific token holdings within a linked wallet. This introduces a sophisticated meritocracy where influence is backed by skin-in-the-game. 
When a verified holder engages with content, their signal is demonstrably louder. Crucially, this creates an intrinsic demand for project tokens; if you want your voice to carry weight in a specific community or governance debate, you must hold the underlying asset. It shifts the social dynamic from mere speculation to a verifiable, reputation-backed economy. "Establishes a verifiable, reputation-backed social economy where influence has tangible financial weight." 2. Sovereignty Without Compromise: The ATProto Bridge Digital identity has historically been a captive asset of the platform provider. Quack.Space (http://Quack.Space) dismantles this vendor lock-in through a sophisticated XRD Domain:Handle:DID mapping. This technical primitive uses the Radix ledger as the immutable "source of truth," providing cryptographic ownership of the social identity. By bridging to the Authenticated Transfer Protocol (ATProto), Quack.Space (http://Quack.Space) offers true portability. Users can migrate their followers, posts, and interaction data across any ATProto-compatible host without losing their history or their audience. This isn't just a convenience; it’s a strategic future-proofing of the social presence. In an era of de-platforming and shifting ecosystem allegiances, your social financial identity remains sovereign and portable. 3. The Feed as a Financial Filter Quack.Space (http://Quack.Space) moves social media from a stream of generic noise to a high-signal financial tool. Through the "Filter Feed by Token Holdings" feature, the user experience becomes dynamic. Instead of an algorithm guessing what you like, the feed can be set to prioritize content, polls, and discussions related to the specific assets in your wallet—such as $QUACK or $XRD. This financial curation is reinforced by the "Profile Top $Ticker" display and real-time in-post metrics. 
When a post mentions a specific ticker, it automatically embeds live pricing, 24-hour percentage changes, and the user's ownership status. This turns every post into a functional data point, ensuring that the context of the conversation is always grounded in the current reality of the market. 4. Coordinated Action via the "Flock Commander" To bridge the gap between social signaling and tangible utility, Quack.Space (http://Quack.Space) introduces the Flock Commander module. This system gamifies community coordination by organizing users into "Missions"—automated "Operations" that range from promoting high-value content to participating in Radix governance proposals. Execution is verified and tracked via linked Telegram bots, aggregating community impact into a metric known as "Flock Power." Participation is further incentivized through "Mission Reports" and "Merits"—reputation points that reward high-quality contribution. This creates a structured reputation economy where a user's status is derived from their history of successful operations and community support, rather than simple follower counts or vanity metrics. 5. Monetizing the Micro-Moment The monetization of social content is often opaque and delayed. Quack.Space (http://Quack.Space) streamlines this via the Tip Cart, Poll Tips, and the Reply Airdrop. These features allow for multi-asset token transfers directly within the social interface, rewarding creators for initiating high-value discussions or contributors for providing insightful replies. The platform measures this impact through two key metrics: • Post ‘Value’: Unlike a simple "like" count, this is a composite score based on tips received, engagement depth, and token-weighted views, adjusted by a time decay factor to ensure that current relevance is prioritized over legacy virality. • Handle Performance: A dynamic assessment of a user's overall contribution history and reputation score within the ecosystem. 
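The "Post Value" metric, as described, combines tips, engagement depth, and token-weighted views under a time-decay factor. The actual formula is not published; the sketch below is a hypothetical illustration in which the weights and the half-life are invented, showing only how a decay factor prioritizes current relevance over legacy virality:

```python
# Hypothetical sketch of a time-decayed composite post score.
# Inputs match the description (tips, engagement depth, token-weighted
# views); the weights and 24h half-life are invented for illustration.

def post_value(tips: float, depth: float, weighted_views: float,
               age_hours: float, half_life_hours: float = 24.0) -> float:
    base = 3.0 * tips + 2.0 * depth + 1.0 * weighted_views
    decay = 0.5 ** (age_hours / half_life_hours)  # time-decay factor
    return base * decay

fresh = post_value(tips=5, depth=10, weighted_views=100, age_hours=0)
day_old = post_value(tips=5, depth=10, weighted_views=100, age_hours=24)
assert fresh == 135.0          # 3*5 + 2*10 + 100, no decay yet
assert day_old == fresh / 2    # one half-life halves the score
```

Under any scheme of this shape, two posts with identical engagement diverge in score purely by age, which is the behavior the "time decay factor" language implies.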
By turning engagement into tangible on-chain rewards, Quack.Space (http://Quack.Space) provides a transparent financial assessment of content, incentivizing a higher standard of participation. Conclusion: The Dawn of the Reputation Economy Quack.Space (http://Quack.Space) is more than a social network; it is the infrastructure for a reputation-backed social existence. This vision is supported by a growing suite of utilities, including the Radix Rolodex—a "reverse ads" platform utilizing a "deckbuilder" function to generate embeddable widgets (or "Rads") for any webpage—and the Doubt/it! poll system. The latter serves as a direct pipeline between social signaling and the gaming ecosystem at Doubtit.digital (http://Doubtit.digital) , where yes/no polls feed directly into browser and board game content. We are moving away from the era of passive browsing and toward a world where your social presence is a provable extension of your financial convictions. As these lines continue to blur, the question for every participant in the Web3 space becomes: Are you ready for your social influence to be directly tied to your financial skin-in-the-game? ## Radix $BlueBalls URL: https://radix.wiki/ecosystem/radix-blue-balls Updated: 2026-02-14 ## All URL: https://radix.wiki/community/all Updated: 2026-02-14 Welcome to All This is your personal community page. Edit it to share your thoughts, projects, and contributions with the RADIX community. About Me Tell the community about yourself... ## Mehmetihsancelebi URL: https://radix.wiki/community/mehmetihsancelebi Updated: 2026-02-12 Welcome to Mehmetihsancelebi This is your personal community page. Edit it to share your thoughts, projects, and contributions with the RADIX community. About Me Tell the community about yourself... ## Primary URL: https://radix.wiki/community/primary Updated: 2026-02-12 Welcome to Primary This is your personal community page. 
Edit it to share your thoughts, projects, and contributions with the RADIX community. About Me Tell the community about yourself... ## ediel URL: https://radix.wiki/community/ediel Updated: 2026-02-11 Welcome to ediel This is your personal community page. Edit it to share your thoughts, projects, and contributions with the RADIX community. About Me Tell the community about yourself... ## Doubt/it! URL: https://radix.wiki/ecosystem/doubt-it Updated: 2026-02-10 Board game, Digital game (Free to play, browser-based, multi-player/multi-device), Quack.Space (http://Quack.Space) polls. AND, gameplay + poll integration coming on https://doubtit.digital! Get $DOUBT tips and other tokens by answering polls on quack.space! Buy the original game at doubtitgame.com ## Radix Rolodex URL: https://radix.wiki/ecosystem/radix-rolodex Updated: 2026-02-10 Create Rads for any project and token and let other users create Decks of Rads for them to display via embed codes on sites around the internet and ecosystem! ## Foton URL: https://radix.wiki/ecosystem/foton Updated: 2026-02-10 Summary: 👋 Open Positions:: Project Submission Status: 🟡 👋 Open Positions:: Project Submission ( https://www.notion.so/Project-Submission-ef696c2d7ebd4117ba59ddbeb0817bb5?pvs=21 (https://www.notion.so/Project-Submission-ef696c2d7ebd4117ba59ddbeb0817bb5?pvs=21) ) Status: 🟡 Foton is a Radix-native Non-Fungible Token (NFT) marketplace designed to facilitate a user-friendly experience for creators and traders in the NFT ecosystem. Co-founders Rares and Vlad developed Foton with the aim of combining effective features from various market-leading websites, such as OpenSea, Discord, Shopify, and Amazon, to create a comprehensive NFT marketplace. https://youtu.be/oJZ54zAA3W8?si=2eNJLeLa0cT2K5nI (https://youtu.be/oJZ54zAA3W8?si=2eNJLeLa0cT2K5nI) Overview Foton is a modular NFT marketplace that focuses on the creation, management, and trading of tokenized digital assets for users. 
The platform supports creators from various backgrounds, including artists, designers, entrepreneurs, influencers, and brands of different sizes. Foton enables these creators to monetize their creations as NFTs and establish connections with their audience or customers. Key Features - Supporting a wide range of creators - Offering an end-to-end solution for launching NFT collections - Functioning as an individual store aggregator - Providing various NFT types - Enhancing the experience for creators and traders Founders Rares Rares, the co-founder and CTO of Foton, has a background in web development, programming, and blockchain technology. His passion for technology and learning contributed to the development of Foton, an NFT marketplace on the Radix Network. Vlad Vlad, the co-founder and CEO of Foton, has a strong background in mathematics, physics, and design. His diverse experiences and problem-solving skills have been instrumental in shaping Foton into a comprehensive platform in the NFT marketplace. https://youtu.be/3wO3Dn5KBf4 Project History Foton traces its origins to Ideomaker, an end-to-end supply chain platform. The decision to create a separate platform specifically for digital assets emerged towards the end of 2021. Identifying pain points in the NFT industry and recognizing the opportunity to address these issues, Vlad and his team decided to develop Foton. The choice to build Foton on Radix was influenced by the Radix team's vision, determination, and ability to address the Trilemma and create a scalable, modern, decentralized form of money. Building Foton on Radix allows for a more efficient development cycle without the complexities often associated with the crypto industry, enabling Foton to focus on providing utility to its users through a comprehensive set of tools and an improved experience.
## Astrolescent URL: https://radix.wiki/ecosystem/astrolescent Updated: 2026-02-10 Summary: Astrolescent is a project on the Radix Network that focuses on optimizing trades by aggregating liquidity from top decentralized exchanges (DEXes). The primary product is a decentralized exchange aggregator, which routes trades for minimal price impact. The Astrolescent team (https://docs.astrolescent.com/astrolescent-docs/readme/team) consists of two developers and an advisor, with plans for expansion. https://youtu.be/O7sjjyrKoGk Team Astrolescent was founded by Michael Videtto. He emphasizes the need for regulation (https://www.radixdlt.com/podcast/astrolescent-the-first-fully-collateralised-stablecoin-launching-on-radix) in the crypto industry to prevent scams, build trust, and make the industry more secure for everyone. The project aims to bring the Radix community together (https://www.radixdlt.com/podcast/astrolescent-the-first-fully-collateralised-stablecoin-launching-on-radix) by offering multiple financial products and rewarding the community with airdrops of the $ASTRL token. Towards the Storm, Episode 8: Astrolescent featuring Michael Videtto (https://youtu.be/MXdM71df_Fo?si=aHrixSOwleZ3T6P4). Recent Developments In June 2023, Astrolescent announced (https://medium.com/@astrolescent/astrolescent-back-to-our-core-d3dada04ab92) that it was discontinuing its plans for a fully collateralized stablecoin ($USDA) due to the insolvency of its banking partner, Prime Trust. Subsequently, the team decided to focus solely on its decentralized exchange aggregator.
As part of this shift, Astrolescent plans to move (https://medium.com/@astrolescent/astrolescent-back-to-our-core-d3dada04ab92) all of its operations from the US to the Netherlands, transition its activities to a newly formed DAO with a governance portal for community voting, and set up revenue sharing of the aggregator fees for holders of the $ASTRL token. The DAO will offer the community the ability to vote on the future evolution of Astrolescent, including treasury spending decisions. Future Plans and Community Engagement Astrolescent continues its commitment to the Radix community with planned airdrops and development expansions. It has completed its first airdrop (https://medium.com/@astrolescent/development-updates-and-airdrops-c3ad733a07a6) to stakers on several platforms, with more planned for the near future. The team is also looking to hire developers (https://medium.com/@astrolescent/development-updates-and-airdrops-c3ad733a07a6) as it builds on Radix and expands its offerings. ## Ociswap URL: https://radix.wiki/ecosystem/ociswap Updated: 2026-02-10 Summary: Ociswap is a decentralized cryptocurrency exchange (DEX) that operates on the Radix platform. It offers permissionless trading of Radix-based tokens through an automated market maker (AMM) model and liquidity pools. Developed in 2021 and launched in 2022, the platform aims to provide a secure and decentralized alternative to traditional cryptocurrency exchanges. https://youtu.be/IlyC8GoNj9E?si=iG4Z3xnW9Yby7jRD History Ociswap started in November 2021 with an MVP initially aimed at providing basic swapping functionality within the Radix ecosystem. At its inception, no smart contract tools were available for Radix, which limited the initial design.
However, the project's team was inspired by CoinGecko's top-100 list and began by listing top Radix projects, adding a swapping service later on. Ocibot, an automated community management tool that evolved from a basic airdrop registration bot into a full-featured assistant using Generative Pre-trained Transformers (GPT), launched on 3 April 2023 (https://t.me/ociswap/186709). Ociswap’s brand identity crystallized during the Radix Grants Program retreat in Spring 2023. A set of questions from RDX Works CEO Piers Ridyard led to the adoption of the slogan "The Front Page of Radix," which has since been instrumental in uniting the project team and community around a shared vision. Team Florian Pieper https://www.youtube.com/watch?v=IJ81EcyF0eg Lukas Steffen https://www.youtube.com/watch?v=HzcUhwP2yxc Christoph Heuermann Features Mechanisms - Automated Market Maker (AMM): Trades are facilitated through liquidity pools rather than traditional order books. - Liquidity Pools: Users can contribute tokens to smart contract-based pools and become liquidity providers (LPs). - Price Calculation: Utilizes the constant product market maker (CPMM) formula, defined as x * y = k, where x and y are the amounts of the two tokens in the pool and k is a constant. As users trade, the amount of one token increases while the other decreases, maintaining the constant k. This ensures that the price changes according to the size of the trade and the available liquidity in the pool. Tokens - $OCI Token: The native utility token used for rewards, transaction fees, and governance. - $oOCI Tokens: Non-tradable tokens that offer options to buy $OCI at discounted rates. Programs - SPLASH: A liquidity incentive program that boosts rewards for selected pools.
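The CPMM pricing rule described above can be sketched in a few lines. This is a generic illustration of any x * y = k pool, not Ociswap's actual on-ledger implementation; the function names and the 0.3% fee value are assumptions for the example.

```python
def cpmm_swap_out(x_reserve: float, y_reserve: float, dx: float, fee: float = 0.003) -> float:
    """Amount of token Y received for depositing dx of token X into a
    constant product (x * y = k) pool, after deducting a swap fee."""
    dx_after_fee = dx * (1 - fee)
    k = x_reserve * y_reserve          # the pool invariant
    new_x = x_reserve + dx_after_fee
    new_y = k / new_x                  # reserves move along the x * y = k curve
    return y_reserve - new_y

# A small trade against deep liquidity has little price impact...
out_small = cpmm_swap_out(1_000_000, 1_000_000, 100)
# ...while the same trade against a shallow pool returns noticeably less,
# because the trade itself moves the price.
out_shallow = cpmm_swap_out(1_000, 1_000, 100)
```

This makes the bullet point concrete: the larger the trade relative to pool liquidity, the further the output deviates from the quoted spot price.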
Ocinomics A financial and governance system designed to incentivize long-term liquidity provision and community engagement. - veTokenomics: Encourages long-term holding by offering voting rights to users who lock their tokens. - Participatory Steps: Users can earn SPLASH NFTs by contributing to liquidity pools and locking their LP tokens. - Incentive Structures: A balanced approach to distributing governance power based on token lock duration. ## PARKO URL: https://radix.wiki/community/parko Updated: 2026-02-10 Welcome to PARKO This is your personal community page. Edit it to share your thoughts, projects, and contributions with the RADIX community. About Me Tell the community about yourself... ## Radix Foundation URL: https://radix.wiki/ecosystem/radix-foundation Updated: 2026-02-10 Summary: The Radix Foundation is a not-for-profit organization focused on the development, promotion, and governance of the Radix public network (https://www.radixfoundation.org/), a decentralized ledger technology. Established in the United Kingdom, the Foundation aims to further the understanding and adoption of Radix technology (https://learn.radixdlt.com/article/what-is-the-radix-foundation-and-its-group-of-entities) through various initiatives, including education, research, and community support. https://youtu.be/NuwTyHJc9i0 History and Formation The Radix Foundation traces its origins to the development of the Radix distributed ledger technology (https://www.radixdlt.com/blog/rtjl-token-holdings-update) by a team led by Dan Hughes (https://www.notion.so/Dan-Hughes-ce73c8c28e8446b5b4e8d997f8be3d98?pvs=21) and Piers Ridyard.
The founders' philosophy (https://learn.radixdlt.com/article/what-is-the-radix-foundation-and-its-group-of-entities) was to decentralize all critical aspects of the network's functionality, with the goal of eventually achieving full decentralization. To facilitate the expedited delivery of the required components while adhering to legal, regulatory, and accounting considerations, the present structure (https://learn.radixdlt.com/article/what-is-the-radix-foundation-and-its-group-of-entities) of the Radix Foundation and its group of entities was established. This structure aimed to facilitate the development and deployment of the Radix network without precluding future decentralization efforts. Radix Foundation Ltd. (https://find-and-update.company-information.service.gov.uk/company/12106715) was incorporated in the United Kingdom as a not-for-profit holding company limited by guarantee. Its role (https://learn.radixdlt.com/article/what-is-the-radix-foundation-and-its-group-of-entities) is to serve as a representative vehicle for providing governance for the Radix community, with the intention of transitioning to a more decentralized governance model after the launch of the Babylon mainnet upgrade. In parallel, Radix Tokens (Jersey) Ltd. (https://www.jerseyfinance.je/business-directory/radix-tokens-jersey-limited/) was established as a wholly-owned subsidiary of the Radix Foundation, responsible for the issuance and management of the Radix ($XRD) tokens in accordance with the Jersey Financial Services Commission's requirements. This entity was critical in conducting the initial token sale (https://learn.radixdlt.com/article/what-is-the-radix-foundation-and-its-group-of-entities) and facilitating the distribution of tokens to various stakeholders.
As the development of the Radix network progressed, additional entities were formed, such as Radix Publishing Ltd. (https://find-and-update.company-information.service.gov.uk/company/13434754), a subsidiary tasked with publishing the open-source code and managing communication channels; Archetype Ltd., which holds non-open-source intellectual property related to the project; and Exosphere Ltd. (https://www.jerseyfsc.org/registry/registry-entities/entity/3018941), whose purpose is currently unknown. Organizational Structure The Radix Foundation operates through a group of interrelated entities (https://learn.radixdlt.com/article/what-is-the-radix-foundation-and-its-group-of-entities), each serving a specific purpose within the broader Radix ecosystem. This structure was designed to facilitate the development and deployment of the Radix network while adhering to legal, regulatory, and accounting requirements. Radix Foundation Ltd (https://find-and-update.company-information.service.gov.uk/company/12106715) Radix Foundation Ltd. is the not-for-profit holding company at the core of the organization. It is incorporated in the United Kingdom as a company limited by guarantee, with no shareholders. The Foundation's articles preclude any director or guarantor from receiving assets or benefiting from them should the company be dissolved. Radix Tokens (Jersey) Ltd (https://www.jerseyfinance.je/business-directory/radix-tokens-jersey-limited/) This entity, wholly owned by the Radix Foundation, is responsible for the issuance and management of the Radix ($XRD) tokens. It operates under the jurisdiction of the Jersey Financial Services Commission (JFSC) and conducts token issuance and related activities in accordance with JFSC requirements. Radix Publishing Ltd (https://find-and-update.company-information.service.gov.uk/company/13434754) Radix Publishing Ltd. is a UK-based subsidiary of the Radix Foundation.
Its primary role is to publish the "canonical version" of the Radix open-source ledger code, control the public GitHub repository, and manage all public-facing communication channels attributed to Radix. This entity ensures the transparency and integrity of the open-source codebase and related information dissemination. Archetype Ltd Archetype Ltd., another Jersey-based subsidiary, holds non-open-source intellectual property related to the Radix project. This entity likely manages and protects proprietary technologies or concepts that are not intended for open-source release. The separation of responsibilities across these entities allows for a structured approach to managing different aspects of the Radix ecosystem, such as token issuance, open-source development, and proprietary technology, while adhering to relevant legal and regulatory frameworks. Exosphere Ltd (https://www.jerseyfsc.org/registry/registry-entities/entity/3018941) Previously known as Radix Ecosystem Holding Ltd., Exosphere is a company registered in Jersey. Governance and Management The Radix Foundation and its group of entities are governed by a board of directors responsible for strategic decision-making, oversight, and ensuring compliance with legal and regulatory requirements. Board of Directors As of early 2024, the board of directors for the Radix Foundation Ltd. consisted of Dan Hughes (https://www.notion.so/Dan-Hughes-ce73c8c28e8446b5b4e8d997f8be3d98?pvs=21) , Piers Ridyard, Lindsay Tan Lim (https://www.linkedin.com/in/lindseylim/) , and Lord Godfrey John Cromwell (https://en.wikipedia.org/wiki/Godfrey_Bewicke-Copley,_7th_Baron_Cromwell) . Hughes and Ridyard are the co-founders of the Radix project and significant shareholders in RDX Works Limited (previously Radix DLT Limited), the core developer of the Radix network source code. 
Recent Appointments In late 2023, the Radix Foundation added two new non-executive directors to its board: Lindsay Tan Lim and Lord Godfrey John Cromwell, as recorded in their Companies House appointment filings.
Governance Processes All budgetary actions and major decisions related to Radix Tokens (Jersey) Ltd. require majority approval (https://www.radixdlt.com/blog/rtjl-token-holdings-update) from its board of directors, which includes Andy Jarrett, Lindsay Bracegirdle, James Cunningham-Davis, Piers Ridyard, and Dan Hughes. The Radix Foundation has implemented strict internal governance processes (https://www.radixdlt.com/blog/rtjl-token-holdings-update) to prevent insider trading and ensure the responsible use of funds. Employees and major fund recipients, such as RDX Works Ltd., are required to obtain documented permission from the board before trading any Radix ($XRD) tokens. Dan Hughes indicated that he would not be taking a salary “for 12 months or until things have turned around significantly” (https://t.me/radix_dlt/771795). Radix Public Network and Token The Radix Public Network (https://www.radixdlt.com/) is a decentralized ledger technology developed by the Radix Foundation and its associated entities. It is built on a novel consensus algorithm called Cerberus and utilizes a unique asset-oriented architecture. The network's native cryptocurrency, Radix ($XRD), plays a crucial role (https://www.radixfoundation.org/licenses) in powering the ecosystem and incentivizing participation. $XRD is used for paying transaction fees, staking, and facilitating various decentralized applications (dApps) and services built on top of the Radix Public Network. Initial Token Distribution According to the Radix Economics White Paper (https://www.radixdlt.com/blog/rtjl-token-holdings-update) released in October 2020, the initial distribution of $XRD tokens was as follows: - Founder Reserve: 2.4 billion $XRD retained by RDX Works Ltd. (formerly Radix DLT Ltd.) - Stable Coin Reserve: 2.4 billion $XRD indefinitely locked by Radix Tokens (Jersey) Ltd.
- Radix Tokens (Jersey) Ltd.: Approximately 3.36 billion $XRD allocated for development, security, subsidies, incentives, and operational expenses. - Token Sale, Community, and Liquidity Incentive allocations made up the remainder. Token Holdings and Management As of February 2024 (https://www.radixdlt.com/blog/rtjl-token-holdings-update) , Radix Tokens (Jersey) Ltd. held approximately 2.21 billion $XRD, representing a significant financial foundation to support the growth and development of the Radix Public Network. These holdings are allocated across various buckets, including development, grants, liquidity programs, and operational expenses. To sustain its initiatives, Radix Tokens (Jersey) Ltd. maintains a robust reserve of assets (https://www.radixdlt.com/blog/rtjl-token-holdings-update) , including fiat currency, stablecoins, and other cryptocurrencies. It works with regulated entities and strategic over-the-counter (OTC) partners to manage the sale of $XRD tokens (https://www.radixdlt.com/blog/rtjl-token-holdings-update) , aiming to minimize market impact and support ecosystem growth. Funding and Expenditure The Radix Foundation and its entities have relied on various funding sources (https://www.radixdlt.com/blog/rtjl-token-holdings-update) to support the development, deployment, and growth of the Radix Public Network. Sources of Funding One of the primary sources of funding for the Radix ecosystem has been the proceeds from the sale of Radix ($XRD) tokens. During the initial token sale in November 2020, Radix Tokens (Jersey) Ltd. received approximately 3.36 billion $XRD tokens (https://www.radixdlt.com/blog/rtjl-token-holdings-update) , intended to cover development costs, security measures, network subsidies, developer incentives, awareness programs, liquidity programs, and other operational expenses. Expenditure Breakdown Over the three-year period leading up to early 2024, Radix Tokens (Jersey) Ltd. 
approved expenditure amounting to approximately £24 million (https://www.radixdlt.com/blog/rtjl-token-holdings-update) to fuel the growth and development of the Radix Public Network. This funding has been allocated across various areas, including: - Technology development: Covering the creation of core components such as the Radix Engine, Scrypto programming language, wallets, and other critical services. - Marketing and community initiatives: Supporting events like APE NYC, Dev Swarm, RadFi 2022, conference attendance, and social media presence. - Ecosystem support: Providing grants, funding programs, and resources to foster the growth of dApps, developers, and the broader Radix ecosystem. For the period between April 2023 and January 2024, the expenditure breakdown (https://www.radixdlt.com/blog/rtjl-token-holdings-update) was as follows: - 47% - Technology and Product Development - 24% - Marketing and Community - 14% - Operational Expenses - 15% - Ecosystem Support Treasury Management Strategy To responsibly manage its token holdings and fund ongoing operations, Radix Tokens (Jersey) Ltd. works with regulated entities and strategic over-the-counter (OTC) partners. These partners are contractually obligated to implement market impact minimization strategies (https://www.radixdlt.com/blog/rtjl-token-holdings-update) , such as placing orders away from the current market price, to avoid significant price movements. Additionally, Radix Tokens (Jersey) Ltd. pauses all treasury management activities (https://www.radixdlt.com/blog/rtjl-token-holdings-update) during major announcements, milestones, and periodically to ensure market neutrality of the strategies employed. Initiatives and Projects The Radix Foundation and its entities have supported numerous initiatives and projects (https://www.radixdlt.com/blog/rtjl-token-holdings-update) aimed at fostering the growth and adoption of the Radix Public Network and its ecosystem. 
Key Projects One of the primary focuses has been the development of the core Radix technology stack, including the Radix Engine, the Scrypto programming language, and various wallets and critical services. These components lay the foundation for a thriving decentralized economy on top of the Radix network. Ecosystem Support In addition to technology development, the Radix Foundation has allocated significant resources to support the broader Radix ecosystem. This includes providing grants and funding programs for developers, entrepreneurs, and projects building on the Radix Public Network. For instance, in early 2024, Radix Tokens (Jersey) Ltd. provided a grant of $10 million worth of $XRD tokens towards the launch of Project Ignition (https://www.radixdlt.com/blog/rtjl-token-holdings-update) , a liquidity campaign aimed at accelerating the growth of the Radix ecosystem. Additionally, an allocation of 25 million $XRD (worth over $1 million at the time of announcement) was made available for developers and entrepreneurs building on the network. Community Initiatives The Radix Foundation has also supported various community initiatives (https://www.radixdlt.com/blog/rtjl-token-holdings-update) , such as the Dev Swarm events, which brought together over 250 developers from across Europe for a shared Scrypto learning experience. Other initiatives include attending and hosting conferences, growing the Radix social media presence, and fostering a vibrant community around the project. Licensing and Open Source The Radix Foundation and its entities have embraced open-source development as a key pillar of their approach, promoting transparency, collaboration, and community engagement. Radix Licenses Radix Publishing Ltd., a subsidiary of the Radix Foundation, has created a set of licenses to facilitate the distribution of software and documentation, as well as to encourage regular contributions from individuals and corporations. 
The primary license used is the Radix License 1.0, which is based on the Apache License 2.0 (https://www.radixfoundation.org/licenses) but with additional provisions to address the complexities of open-source, permissionless distributed ledgers. This license aims to protect the Radix community and ensure the creation of reliable and long-lived software products through collaborative open-source development. Open-Source Development All software and documentation published by Radix Publishing Ltd. or any of its projects are licensed under the Radix License 1.0 (https://www.radixfoundation.org/licenses) , unless explicitly stated otherwise. This approach ensures transparency and promotes community participation in the development of the Radix Public Network and its associated technologies. By embracing open-source development and fostering a collaborative community, the Radix Foundation aims to create a robust and decentralized ecosystem around the Radix Public Network. ## Radix Mainnet (Xi'an) URL: https://radix.wiki/contents/tech/releases/radix-mainnet-xian Updated: 2026-02-10 Xi’an [ /ʃi’æn/ ] will be the next major release (https://learn.radixdlt.com/article/what-is-the-radix-roadmap) of the Radix network. Xi’an will allow an unlimited number of shard groups (/contents/tech/core-concepts/shard-groups) , enabling Radix’s sharded (/contents/tech/core-concepts/sharding) state model to be fully utilized. State Model The Radix 'universe' exists on a fixed space of 2^256 data shards (/contents/tech/core-concepts/sharding) , served by a distributed network of validators called Shard Groups (/contents/tech/core-concepts/shard-groups) . Shard coverage for Xi’an will be allocated according to processing power, storage and bandwidth. A fixed ledger space allows wallet addresses to be deterministically mapped to a shard and, hence, for the global state of the network to be a derived sum of shard states. 
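The deterministic address-to-shard mapping described above can be illustrated with a small sketch. The hash function (SHA-256), the placeholder address, and the reduction from a shard to a shard-group index are assumptions for illustration only, not the actual Radix protocol definitions.

```python
import hashlib

SHARD_SPACE = 2 ** 256  # the fixed shard space of the Radix 'universe'

def shard_for_address(address: str) -> int:
    """Map a wallet address deterministically to a point in the 2^256
    shard space (illustrative: the real protocol defines its own mapping)."""
    digest = hashlib.sha256(address.encode()).digest()
    return int.from_bytes(digest, "big")  # an integer in [0, 2^256)

def shard_group_for_address(address: str, num_groups: int) -> int:
    """Assign the address's shard to one of `num_groups` validator shard
    groups by partitioning the shard space into equal contiguous ranges."""
    return shard_for_address(address) * num_groups // SHARD_SPACE

# The mapping is deterministic: the same (hypothetical) address always
# lands in the same shard group, so any node can derive it locally.
g1 = shard_group_for_address("rdx1-example-address", 16)
g2 = shard_group_for_address("rdx1-example-address", 16)
```

Because every node computes the same mapping from the address alone, the global state can be treated as the derived sum of shard states, as the section notes.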
With so many shards, at 1 million transactions per second (tps) the likelihood of two substates being mapped to the same hash (a hash collision) has been estimated at one every 3.67^63 years (https://t.me/radix_dlt/48589). The Radix protocol assumes that 256 bits are enough to avoid "almost all" collisions in shard addresses. In the rare event of a collision, the corresponding transaction would fail and the user would simply retry, automatically resulting in different shard addresses. A proto-implementation of Xi’an is currently being built in the Cassandra (/contents/tech/research/cassandra) project. Radix’s founder, Dan Hughes (/community/dan-hughes), has said that the first release of Xi’an might begin with 8-16 shard groups (https://youtu.be/FZWT3j9XHMI), which will be increased as network throughput grows. Consensus Radix's consensus protocol is called Cerberus. It will allow an effectively unlimited number of shards to operate independently without losing atomic composability. Validator Nodes in Xi’an It remains unclear whether every Xi’an wallet will be given the software to run a validator node. How the consensus algorithm operates when most nodes are powered down is also uncertain, and the power drain and feasibility of running mobile validators are additional concerns. Xi’an Proof of Concept The Xi’an protocol for the Radix network is a speculative concept aimed at increasing the number of validator sets and shard groups for better transaction processing. The number of validator sets may be double the amount required to satisfy the demand for transactions per second (TPS). Each validator set may consist of up to 3000 validators, with a minimum of 100 validators for security purposes. The Xi’an protocol allows the number of validators in each validator set to vary between 100 and 3000 while maintaining the same number of shard groups. Only when TPS demand increases enough to utilize the whole 3000 validators will it be necessary to add a shard or split existing shard groups.
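As a rough sanity check on why a 256-bit address space makes collisions negligible, the standard birthday-bound approximation can be computed directly. This is a generic back-of-the-envelope estimate under a uniform-hash assumption, not the calculation behind the specific figure cited in this section.

```python
import math

def birthday_collision_probability(n_items: int, space_bits: int = 256) -> float:
    """Approximate probability that at least one pair among n uniformly
    random values collides in a space of 2**space_bits: p ~ n^2 / 2^(bits+1)."""
    # Work in log2 to avoid overflow for astronomically small probabilities.
    log2_p = 2 * math.log2(n_items) - (space_bits + 1)
    return 2.0 ** log2_p

# Even after a trillion substates have been created, the chance of any
# collision having occurred is vanishingly small.
p = birthday_collision_probability(10 ** 12)
```

The probability grows with the square of the number of substates, which is why the protocol can simply treat a collision as a rare, retryable transaction failure rather than designing around it.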
DEVELOPMENT
- Launch Date: 2027 (https://www.radixdlt.com/blog/radix-labs-roadmap---to-hyperscale-and-beyond)
- Antecedent: Babylon (/contents/tech/releases/radix-mainnet-babylon)
- License: Radix License, v1 (http://radixfoundation.org/licenses/license-v1)
LEDGER
- State Model: Sharded (/contents/tech/core-concepts/sharding)
- Shard Groups: Unlimited (/contents/tech/core-concepts/shard-groups)
- Sybil Protection: Delegated Proof of Stake (/contents/tech/core-concepts/delegated-proof-of-stake-dpos)
- Consensus Protocol: Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol)
- Execution Environment: Radix Engine (/contents/tech/core-protocols/radix-engine) v2
- Validator Node Cap: Unlimited
## Dan Hughes URL: https://radix.wiki/community/dan-hughes Updated: 2026-02-10 Daniel Patrick Hughes (24 July 1979 - 27 July 2025), also known by his forum pseudonym ‘fuserleer’, was a British software engineer, veteran of the gaming industry, founder of Radix, and Founder/Director of RDX Works. Early Life and Education “I’m proud to be from Stoke-on-Trent, my roots are here and it provides a peaceful and calm environment to focus in on developing the technology and coding that will ensure DLT is scalable and sustainable.” - Link (https://www.finyear.com/Dan-Hughes-CTO-and-Founder-of-Radix-DLT_a41870.html) Dan Hughes grew up in Stoke-on-Trent in the UK (https://www.radixdlt.com/blog/in-memory-of-dan-hughes-founder-of-radix). His childhood in the former mining town was marked by “weekends in working men's clubs, where card-carrying union men would play darts and bingo with lashings of lager” (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/), while most other boys his age were playing soccer.
Hughes' path toward technology began early when his dad, a bus driver, brought home a ZX81 computer when Hughes was about five years old (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) . This introduction to computing would prove foundational, as Hughes found an instant attraction to computer programming (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) that set him apart from his peers. He later recalled that "no one throughout my entire school life was interested in coding" (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) , highlighting his unique passion for the field from an early age. This early exposure to computing and programming laid the tracks for Hughes to eventually become a successful mobile developer (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) , setting the stage for his later pioneering work in distributed ledger technology. Career Hughes opted not to go to university and instead worked for several software, gaming and mobile development companies (https://www.linkedin.com/in/dan-hughes-2a6b117/details/experience/) after finishing school. In the early 2000s, Hughes pivoted to freelance work, focusing on mobile SKUs. A couple of years later, he started his own company, KDB Technology, which specialized in Near Field Communication (NFC) technology and contactless payment services. Under Hughes's leadership, KDB Technology provided services for major mobile OEMs and operators such as Nokia, Sony Ericsson, T-Mobile, and Samsung. Before founding Radix, Hughes established himself in the mobile technology sector, where he previously helped build the software behind NFC mobile payments (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) . 
His early career provided him with both technical expertise and financial resources that would later prove crucial during his years of independent research and development. Radix Development (2012-2018) Hughes' journey into distributed ledger technology began when he first heard about Bitcoin in 2011 and a year later finally got around to downloading the 15-page document that became known to blockchain followers as the Satoshi White Paper (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) . After studying Nakamoto's architecture, Hughes played around with the code, trying to modify its architecture in a process known as forking (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) . He quickly identified fundamental scalability issues, realizing that the more people used bitcoin for transactions, the slower the system would become (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) . Determined to solve these problems, Hughes decided to build his own version of Nakamoto's formula (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) . In 2012, he moved out of his small home office and took over the dining room of his house, removing the dining table and replacing it with stacks of servers, filing cabinets, white boards, six screens and a mass of cables (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) , much to the anguish of his wife (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) . The following six years were marked by intense dedication and personal sacrifice. Hughes worked in his dining room every day, waking up to write code virtually nonstop until around 4 a.m. 
the next morning (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) . He lived off his savings and some investment returns he'd made from some bets in mobile technology (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) . The work was so consuming that Hughes experienced moments of severe depression, where he thought blockchain's scaling problem was insurmountable (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) . The perfectionist nature of Hughes' approach meant that he threw away months of work at least twice during this six-year period (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) . One particularly difficult moment came after 18 months of work on one iteration, when he realized he needed to start from scratch again (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) . The financial pressure eventually became so intense that Hughes and his wife sold their four-bedroom house and downsized to a smaller, two-bedroom home (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) . Radix DLT Foundation and Leadership In early 2017, Hughes achieved a breakthrough with what would become the fourth iteration of what he had started working on in 2012, containing only about 10% of any of the code he wrote over the last six years (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) . That year, he moved into a new office and it took about half a year to readjust from regular night-owling (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) . 
Hughes' work began attracting significant attention and investment. He got into Y Combinator, a prestigious Silicon Valley program for startup founders, in 2017 (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) , though he dismissed that part of his story as a fluke (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) . More importantly, he attracted $1 million in investment from a leading European venture capitalist (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) , specifically from Saul Klein, who runs London venture capital firm LocalGlobe (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) . Klein compared meeting Hughes to meeting the developers of Skype in Estonia (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) . By 2019, Hughes had assembled an international team, with eight engineers who had recently flown in to Stoke from Argentina, Australia and elsewhere to work with him, along with ten more staff in London (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) . The project had also gained a significant community following, with more than 19,000 people following his work on the messaging app Telegram (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) . In October 2024, Hughes announced a major organizational restructuring. He stepped away from RDX Works to focus solely on the Radix Foundation and Radix Labs, a new subsidiary focused on research into Cerberus & Xi'an (https://www.radixdlt.com/blog/its-personal-time-to-restructure-get-things-back-on-track) .
This change was part of addressing the conflict of interest between RDX Works and the Radix Foundation (https://www.radixdlt.com/blog/its-personal-time-to-restructure-get-things-back-on-track) , allowing each entity to focus on distinct objectives. The organizational changes continued in February 2025, when Hughes announced the decision to end ongoing development work with RDX Works (https://www.radixdlt.com/blog/foundation-update-a-fork-in-the-road) and move development in-house at the Radix Foundation. This restructuring was designed to provide a leaner, more efficient approach (https://www.radixdlt.com/blog/foundation-update-a-fork-in-the-road) and allow for complete control over the development path of features. Hughes' career ended when he passed away unexpectedly from natural causes at his home on a Sunday night (https://www.radixdlt.com/blog/in-memory-of-dan-hughes-founder-of-radix) in July 2025, just as he was leading the Hyperscale test and reinvigorating public dialogue about Radix's future.

| Position | Company | Business | Start Date | Finish Date |
| --- | --- | --- | --- | --- |
| Senior Developer | Software Creations (https://find-and-update.company-information.service.gov.uk/company/03047976/officers) | Console games | November 1996 | April 1998 |
| Director of Technology | Camel 28 | PC & console games | April 1998 | October 2000 |
| Senior Developer, Architect & Team Leader | Barcrest (https://find-and-update.company-information.service.gov.uk/company/00937830) | Video gambling machines | October 2000 | July 2003 |
| Freelance Consultancy, Development & Team Management | CSP Mobile GMBH (https://www.csp-sw.com) | Mobile development | August 2003 | April 2005 |
| Owner | KDB Technology (https://find-and-update.company-information.service.gov.uk/company/06512975) | Mobile technology | July 2005 | November 2012 |
| Founder / CTO | Radix DLT (Unincorporated) | Radix core development | November 2012 | September 2017 |
| Co-Founder / CTO | Surematics (https://find-and-update.company-information.service.gov.uk/company/10508397) | Insurance tooling | November 2016 | - |
| Founder / CTO | RDX Works (https://www.notion.so/RDX-Works-8092803705114cd1b34bbef5dd98d7af?pvs=21) | Radix core development | September 2017 | - |

Technical Contributions Cerberus Consensus Protocol Main article: Cerberus (Consensus Protocol) (https://www.notion.so/Cerberus-Consensus-Protocol-d5755bce114d42e781128ee6a5603dcd?pvs=21) Hughes' most significant technical achievement was the development of Cerberus, a novel cross-shard consensus protocol that addressed fundamental scalability limitations in existing blockchain systems. The academic paper evaluating Cerberus was accepted to JSys, the Journal of Systems Research, after completing a rigorous peer review process (https://www.radixdlt.com/blog/cerberus-consensus-peer-reviewed) . This peer review involved independent experts verifying the proof, theory, and soundness of Cerberus (https://www.radixdlt.com/blog/cerberus-consensus-peer-reviewed) . The academic evaluation was conducted in partnership with Professor Mohammad Sadoghi and the University of California, Davis (https://www.radixdlt.com/blog/cerberus-consensus-peer-reviewed) , with Professor Sadoghi noting their four-year collaboration in exploring the frontier of resilient consensus (https://www.radixdlt.com/blog/cerberus-consensus-peer-reviewed) . The research paper demonstrated that Cerberus is the most performant and efficient consensus protocol that supports atomic updates across shards (https://www.radixdlt.com/blog/cerberus-consensus-peer-reviewed) when compared with other notable consensus protocols.
The comparative analysis included evaluation against Chainspace, which was acquired for Facebook's Libra DLT solution, AHL, Sharper, and Ring BFT (https://www.radixdlt.com/blog/cerberus-consensus-peer-reviewed) . Testing results showed that all three flavors of Cerberus came out clearly on top under all scenarios, with linear scalability evident and Cerberus comfortably exceeding 1 million transactions per second under multiple scenarios (https://www.radixdlt.com/blog/cerberus-consensus-peer-reviewed) . The research concluded that Cerberus is the most efficient consensus protocol, with the best throughput and the lowest latency, when compared to other state-of-the-art multi-shard consensus protocols (https://www.radixdlt.com/blog/cerberus-consensus-peer-reviewed) . Radix Engine and Architecture Main article: Radix Engine (https://www.notion.so/Radix-Engine-1cd101cd52264df9806d63da555d5828?pvs=21) Hughes designed the Radix Engine as a comprehensive smart contract execution environment that was already implemented in Babylon and built with sharding in mind (https://www.radixdlt.com/blog/radix-labs-roadmap---to-hyperscale-and-beyond) . The architecture represented a fundamental departure from traditional blockchain design, utilizing what Hughes described as a pre-sharded architecture with a massive state space of 2^256, providing the foundation for Radix's linear scalability approach (https://www.radixdlt.com/blog/hyperscale-alpha---part-i-the-inception-of-a-hybrid-consensus-mechanism) . The system's innovation lay in its approach to state management, where the thing that lives in a shard wasn't put there by a "user" but was the result of a hash function, making it essentially like picking a random number from a space of 2^256 possibilities (https://www.radixdlt.com/blog/technical-ama-with-founder-dan-hughes-16th-february-2021) .
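The hash-based placement described above can be sketched in a few lines. This is an illustrative model only, not Radix code: the function names (`shard_address`, `shard_group`), the choice of SHA-256, and the 16-group deployment are our assumptions for demonstration.

```python
# Illustrative sketch (ours, not Radix APIs): a substate's location is the
# output of a 256-bit hash, i.e. a uniformly random point in the shard space.
import hashlib

SHARD_GROUPS = 16  # assumed early deployment size, per the 8-16 group figure

def shard_address(substate_id: bytes) -> int:
    """Map a substate identifier to a point in the 2^256 shard space."""
    digest = hashlib.sha256(substate_id).digest()
    return int.from_bytes(digest, "big")  # uniform in [0, 2^256)

def shard_group(substate_id: bytes, groups: int = SHARD_GROUPS) -> int:
    """Assign the substate's shard address to one of `groups` shard groups."""
    return shard_address(substate_id) % groups

print(shard_group(b"component:resource_vault_42"))
```

Because the address is a hash output rather than a user-chosen value, substates spread uniformly across the space, which is what makes hot-spot congestion statistically negligible.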
This design meant that the chances of two things ending up in the same shard over a reasonably measurable period of time are practically zero (https://www.radixdlt.com/blog/technical-ama-with-founder-dan-hughes-16th-february-2021) , effectively solving congestion issues that plagued other systems. Hughes emphasized that the Radix Engine maintained commercial grade integrity, having received the only 10/10 audit score from Hacken (https://www.radixdlt.com/blog/radix-labs-roadmap---to-hyperscale-and-beyond) , and any modifications for sharding would be performed in a manner to preserve that commercial integrity (https://www.radixdlt.com/blog/radix-labs-roadmap---to-hyperscale-and-beyond) . Research Networks Main article: Cassandra (https://www.notion.so/Cassandra-fdf402d1427e4a42a366b74ae40b6c86?pvs=21) Hughes developed several research platforms to advance consensus mechanism design, most notably the Cassandra research network. Cassandra has demonstrated extremely high throughput, low finality times, and sustained performance for days on end, proving that true scalability is achievable (https://www.radixdlt.com/blog/radix-labs-roadmap---to-hyperscale-and-beyond) . However, Hughes was clear that while Cassandra performs all mandatory functionality to demonstrate scale, it was not implemented in a way that could be considered commercial grade (https://www.radixdlt.com/blog/radix-labs-roadmap---to-hyperscale-and-beyond) . The research culminated in Hyperscale Alpha, which Hughes described as a consensus protocol that uses both a proof-of-work variant and proof-of-stake as sybil resistance mechanisms (https://www.radixdlt.com/blog/hyperscale-alpha---part-ii-design-principles) .
This hybrid approach was designed to achieve weak liveness guarantees and deterministic safety through its proof-of-work sybil resistance mechanism and longest chain rule (https://www.radixdlt.com/blog/hyperscale-alpha---part-ii-design-principles) , while incorporating the strong deterministic safety guarantees of Byzantine Fault Tolerant (BFT) protocols (https://www.radixdlt.com/blog/hyperscale-alpha---part-ii-design-principles) . Hughes' final major technical initiative was the Hyperscale Alpha test, which aimed to demonstrate true systemic scalability, targeting a sustained 1 million complex transactions per second using a globally distributed validator set involving community participation (https://www.radixdlt.com/blog/radix-labs---hyperscale-alpha-test) . This test was designed to be as close to a real-world performance test as possible, with nodes operated by people around the world, handling complex transactions under intentionally imperfect conditions, with no gimmicks and no disabled essential features (https://www.radixdlt.com/blog/radix-labs---hyperscale-alpha-test) . The technical innovations Hughes developed were intended to support Xi'an, Radix's next big leap, which would take the system from proof-of-concept to a commercial-grade system ready to scale globally (https://www.radixdlt.com/blog/radix-labs-roadmap---to-hyperscale-and-beyond) , with an ambitious timeline targeting an Alpha release in early 2027, followed by a Beta later that year, with a full launch slated for the second half of 2027 (https://www.radixdlt.com/blog/radix-labs-roadmap---to-hyperscale-and-beyond) . Public Recognition Media Coverage Hughes gained significant media attention in January 2019 when Forbes published a major profile titled "This Reclusive Engineer Is Plotting The Death Of Blockchain" (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) by technology journalist Parmy Olson. 
The article opened by observing that one of the hardest things an engineer can do is throw away months of work, and that in the six years Dan Hughes spent building a new alternative to blockchain in self-imposed isolation, he did it at least twice (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) . The Forbes piece highlighted Hughes' unconventional approach to solving blockchain's fundamental problems, describing how few tech startups base themselves in this former mining haunt, with the biggest corporate inhabitants being Vodafone and Bet365, an online gambling firm (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) . Yet the article emphasized that the silence there is what's important, as it gave Hughes the space to think up highly complex computations with minimal distraction (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) . The profile drew parallels between Hughes and other successful technology innovators, noting that engineers can be an isolated bunch, so it's no surprise that some of the most successful technology services have stealth beginnings, with the founders of WhatsApp refusing to consort with other tech founders or attend conferences before they were suddenly bought by Facebook for $19 billion, and Skype having similarly isolated beginnings in Estonia (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) . Academic and Industry Recognition Hughes achieved significant academic recognition when the academic paper evaluating Cerberus was accepted to JSys, the Journal of Systems Research, after completing a rigorous peer review process (https://www.radixdlt.com/blog/cerberus-consensus-peer-reviewed) .
This achievement meant that Radix now joins a small family of public ledgers that have had their consensus protocol tested to the highest academic standards (https://www.radixdlt.com/blog/cerberus-consensus-peer-reviewed) . The academic validation came through collaboration with Professor Mohammad Sadoghi and the University of California Davis (https://www.radixdlt.com/blog/cerberus-consensus-peer-reviewed) , who provided support in the analysis and testing of the various protocols contained within the paper and during the peer review process (https://www.radixdlt.com/blog/cerberus-consensus-peer-reviewed) . Professor Sadoghi acknowledged their partnership, stating that in partnership with Radix over the past four years, they had been exploring the frontier of resilient consensus, with patience and perseverance overcoming the skepticism and technical barriers in this uncharted territory (https://www.radixdlt.com/blog/cerberus-consensus-peer-reviewed) . Community Engagement and Technical Leadership Hughes became known for his direct engagement with the technical community through regular educational sessions. He hosted technical ask-me-anything (AMA) sessions every two weeks on the main Radix Telegram Channel (https://www.radixdlt.com/blog/radix-technical-ama-with-founder-dan-hughes-16th-march-2021) , where community members could ask Dan questions on tech developments, project milestones, or anything else related to the cutting-edge innovations of Radix (https://www.radixdlt.com/blog/radix-technical-ama-with-founder-dan-hughes-16th-march-2021) . These sessions covered complex technical topics and demonstrated Hughes' ability to explain sophisticated concepts to diverse audiences. The AMAs typically featured great discussions around Cerberus, Radix's next-generation consensus mechanism, general design approaches to the network, and general industry questions (https://www.radixdlt.com/blog/radix-technical-ama-with-founder-dan-hughes-2nd-february-2021) . 
By 2019, Hughes had built a substantial following, with more than 19,000 people following his work on the messaging app Telegram, a popular platform for blockchain enthusiasts (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) . His approach to community engagement was notably authentic, as he wasn't Silicon Valley polished, and never pretended to be, which was part of what made him so compelling (https://www.radixdlt.com/blog/in-memory-of-dan-hughes-founder-of-radix) . Hughes' reputation in the industry was built on his technical rigor and direct communication style. He gave everyone the time of day, treated people with respect, and believed in fair, honest dialogue, whether you were a new developer, a community member, or an industry veteran (https://www.radixdlt.com/blog/in-memory-of-dan-hughes-founder-of-radix) . This authenticity, humility, and clarity of thought drew so many people to him, and to the vision he carried (https://www.radixdlt.com/blog/in-memory-of-dan-hughes-founder-of-radix) . Death and Legacy In his final months, Hughes had taken on an increasingly active public role in advancing Radix's development. He was leading the Hyperscale test, reinvigorating public dialogue, and deeply engaged in driving Radix forward (https://www.radixdlt.com/blog/in-memory-of-dan-hughes-founder-of-radix) , with his energy and optimism being contagious, and his belief in what Radix could achieve remaining stronger than ever (https://www.radixdlt.com/blog/in-memory-of-dan-hughes-founder-of-radix) . Hughes passed away unexpectedly from natural causes at his home (https://www.radixdlt.com/blog/in-memory-of-dan-hughes-founder-of-radix) on July 27, 2025. 
The announcement, made by the Radix team, expressed profound sadness (https://www.radixdlt.com/blog/in-memory-of-dan-hughes-founder-of-radix) at the loss of someone they described as a visionary, a builder, and a relentless problem-solver (https://www.radixdlt.com/blog/in-memory-of-dan-hughes-founder-of-radix) . The team noted that Hughes wasn't driven by recognition or hype, but by a deep conviction that better systems could, and must, be built (https://www.radixdlt.com/blog/in-memory-of-dan-hughes-founder-of-radix) . Impact and Organizational Continuity Hughes' death was recognized as a huge loss, for all who worked alongside him, for the Radix community, and the broader decentralised technology space (https://www.radixdlt.com/blog/in-memory-of-dan-hughes-founder-of-radix) , with his absence being felt profoundly (https://www.radixdlt.com/blog/in-memory-of-dan-hughes-founder-of-radix) . However, the Radix team emphasized that Dan's work and vision live on, with the foundations he built being robust, battle-tested, and already in the hands of thousands of developers and users (https://www.radixdlt.com/blog/in-memory-of-dan-hughes-founder-of-radix) . The organization acknowledged that the sudden loss of a founder inevitably brings disruption, particularly when that founder has been so central to both the technology and the vision (https://www.radixdlt.com/blog/in-memory-of-dan-hughes-founder-of-radix) . To ensure continuity, Adam Simmons (Chief Strategy Officer) and Jonathan Day (Finance Director) joined Andy Jarrett (CEO) as Directors of the Radix Foundation (https://www.radixdlt.com/blog/in-memory-of-dan-hughes-founder-of-radix) . 
Jonathan Day was described as someone who has been with the Foundation since 2023 and is a co-founder of Blockchain Jersey, a business which works closely with government and regulators; he has been involved in crypto since 2011 and has worked with a number of firms including CoinShares and Binance (https://www.radixdlt.com/blog/in-memory-of-dan-hughes-founder-of-radix) . Technical and Philosophical Legacy Hughes' technical contributions established a foundation for continued development in distributed ledger technology. The team committed that the roadmap he shaped remains in motion, and the team he assembled shares his values and is committed to delivering on the vision Dan started (https://www.radixdlt.com/blog/in-memory-of-dan-hughes-founder-of-radix) . The organization stated that the best way to honour his memory is to continue building with the same rigour, passion, integrity, and commitment that defined him (https://www.radixdlt.com/blog/in-memory-of-dan-hughes-founder-of-radix) . Hughes was remembered for his authentic approach to both technology and human relationships. The memorial statement noted that he “wasn't Silicon Valley polished, and never pretended to be, which was part of what made him so compelling (https://www.radixdlt.com/blog/in-memory-of-dan-hughes-founder-of-radix)”. His leadership style was characterized by accessibility and respect, as he gave everyone the time of day, treated people with respect, and believed in fair, honest dialogue, whether you were a new developer, a community member, or an industry veteran (https://www.radixdlt.com/blog/in-memory-of-dan-hughes-founder-of-radix) .
The Radix team concluded their memorial with a commitment to continuity: “together, we will ensure his legacy not only endures, but becomes what he always believed it could be (https://www.radixdlt.com/blog/in-memory-of-dan-hughes-founder-of-radix)”. They emphasized that while “Radix has always been more than one person, Dan was at its core (https://www.radixdlt.com/blog/in-memory-of-dan-hughes-founder-of-radix)”, highlighting both his irreplaceable contribution and the institutional strength he had helped build to carry forward his vision. Controversies Dan Hughes as Satoshi Nakamoto A post (https://radnode.io/blog/is-the-real-satoshi-nakamoto-dan-hughes-of-radix) on Radnode (https://www.notion.so/Radnode-f20371f5d2404d49a7483625c4e7d210?pvs=21) 's blog presents a theory linking Hughes to the pseudonymous creator of Bitcoin, Satoshi Nakamoto. The theory hinges on various elements such as Hughes's interest in racing (particularly Nissan GTRs), his geographical location in the United Kingdom (potentially matching Satoshi's alleged time zone), and the similarities in their technical backgrounds and writing styles. While the theory is speculative and humorous in nature, it underscores the depth of Hughes's contributions to the crypto space. Blackhat Controversy “Way back in 2010ish I helped a friend start an ad network as I was in between previous company and crypto it was promoted on various forums and I ran point on some of them for him if you know anything about online advertising you'll know that fraudulent clicks are a daily thing and he was getting a lot of them A particularly big batch was from a user who had an account on blackhat. Payment was refused and he kicked up a shit storm on there.
And that's it, nothing more exciting than that.” - Dan Hughes, Telegram (https://t.me/delphibets/145020) , 12/12/2023 Articles by Dan Hughes https://cointelegraph.com/news/the-fallacy-of-scalability (https://cointelegraph.com/news/the-fallacy-of-scalability) Hughes in 2019 ## wingspan URL: https://radix.wiki/community/wingspan Updated: 2026-02-10 Welcome to wingspan This is your personal community page. Edit it to share your thoughts, projects, and contributions with the RADIX community. About Me Tell the community about yourself... ## Quack Space URL: https://radix.wiki/community/quack-space Updated: 2026-02-10 Welcome to Quack Space This is your personal community page. Edit it to share your thoughts, projects, and contributions with the RADIX community. About Me Tell the community about yourself... ## Kangaderoo URL: https://radix.wiki/community/kangaderoo Updated: 2026-02-10 Welcome to Kangaderoo This is your personal community page. Edit it to share your thoughts, projects, and contributions with the RADIX community. About Me Tell the community about yourself... ## VandyILL URL: https://radix.wiki/community/vandyill Updated: 2026-02-10 Welcome to VandyILL This is your personal community page. Edit it to share your thoughts, projects, and contributions with the RADIX community. About Me Tell the community about yourself... ## Vander Trader URL: https://radix.wiki/community/vander-trader Updated: 2026-02-10 Welcome to Vander Trader This is your personal community page. Edit it to share your thoughts, projects, and contributions with the RADIX community. About Me Tell the community about yourself... ## Black Fox URL: https://radix.wiki/community/black-fox Updated: 2026-02-09 Welcome to Black Fox This is your personal community page. Edit it to share your thoughts, projects, and contributions with the RADIX community. About Me Tell the community about yourself... 
## Memerad URL: https://radix.wiki/community/memerad Updated: 2026-02-09 Welcome to Memerad This is your personal community page. Edit it to share your thoughts, projects, and contributions with the RADIX community. About Me Tell the community about yourself... ## ArtQ.xrd URL: https://radix.wiki/community/artq-xrd Updated: 2026-02-09 Welcome to ArtQ.xrd This is your personal community page. Edit it to share your thoughts, projects, and contributions with the RADIX community. About Me Tell the community about yourself... ## Blockchain Trilemma URL: https://radix.wiki/contents/tech/core-concepts/blockchain-trilemma Updated: 2026-02-08 The Blockchain Trilemma refers to the assumed mutual exclusivity of digital ledger security, scalability, and decentralization. Overview The trilemma arises because it's currently very challenging to maximize all three of these attributes simultaneously. For instance, increasing scalability often comes at the cost of decentralization, as larger block sizes or faster block times can lead to a greater concentration of mining power. Similarly, improving security can require sacrificing scalability or decentralization. Security This attribute refers to the ability of the blockchain to resist attacks such as a 51% attack, where a single entity gains control of more than half of the network's mining hash rate, potentially allowing them to disrupt the network by double-spending coins. Scalability This refers to the capacity of the network to handle a large number of transactions per second. Many early blockchains, like Bitcoin, have struggled with scalability issues, which can lead to slow transaction times and high fees. Decentralization Decentralization is the degree to which control and decision-making within the network is distributed among participants, rather than being concentrated in a single central authority. 
High levels of decentralization are often seen as desirable because they can make the network more resistant to censorship and manipulation. ## Atomic Composability URL: https://radix.wiki/contents/tech/core-concepts/atomic-composability Updated: 2026-02-08 Atomic Composability [ /əˈtɑmɪk kəmˌpoʊzəˈbɪlɪti/ ], sometimes erroneously called Synchronous Composability, is the ability to execute multiple operations across separate applications in a single transaction, without the risk of a partial failure, such that complex, multi-party transactions are either executed successfully or rolled back to their original state without data loss or inconsistency. Composability Composability is the ability to recombine components within a system into larger structures and for the output of one to be the input of another. It is a sub-class of interoperability but confined to components within a system rather than between systems. The concept of composability can be divided into high-level ‘syntactic composability’ and low-level ‘morphological composability’. - Syntactic Composability can be likened to the grammar of a system and is the design principle that enables Lego pieces to fit together, or smart contracts on a decentralized ledger to permissionlessly call each other’s methods. - Morphological Composability ensures that the internal structures of components within a system, such as functions and interfaces, are compatible with those of other components. An example of this is the ERC20 token standard on Ethereum, which ensures that tokens are formatted in the same way across smart contracts. Distinctions - Modularity is a prerequisite for composability but is a distinct concept. Modularity is the design principle that divides a system into smaller parts or modules, but it does not prescribe the relationships between them. - Integration is the process of orchestrating multiple components into a cohesive entity that is greater than the sum of its parts.
An example would be combining words into a novel or separate business functions into a corporation. Atomicity Atomicity is one of the four ‘ACID’ attributes of reliable database transactions, along with Consistency, Isolation and Durability. In this context, atomicity means that all transactions are treated as a single, indivisible unit, regardless of how many operations they encompass. This implies two things: - All or Nothing: If all operations in a transaction complete successfully, the transaction is considered committed. If any of the operations fail, the entire transaction is rolled back to its previous state, ensuring that the database remains unchanged. - Indivisible Operations: Once a transaction is committed, it appears as a single operation, meaning there's no way to identify the sequence of operations that took place during that transaction from the perspective of other concurrent transactions. Atomicity in this context relates to both the syntactic and morphological qualities of composability described above. Importance in Decentralized Systems Atomic composability is deemed crucial (https://medium.com/@radixdlt/breakthrough-in-consensus-theory-scaling-defi-without-breaking-composability-13c777ff5200) to decentralized finance (DeFi) because it allows for operations such as flash loans, where an asset is borrowed, invested, and paid back within a single transaction. Research by RDX Works has found that 69% of all Ethereum transactions utilize some degree of atomic composability. (https://twitter.com/PiersRidyard/status/1583013758620008449) In the context of decentralized systems, atomic composability is important for the following reasons: - Consistency: Ensures that the decentralized system maintains a consistent state, especially when dealing with complex, interrelated transactions. - Trust: Users of the system can be confident that their transactions will either complete in their entirety or fail without partial effects. 
- Efficiency: Reduces the need for manual intervention or complicated recovery mechanisms to handle partial failures. Atomic Composability in Radix Radix has prioritized atomic composability by embracing three principles (https://uploads-ssl.webflow.com/6053f7fca5bf627283b582c2/61d5a4583aad156a094c5628_Radix%20DeFi%20White%20Paper%20v2.05.pdf): - Support an almost infinite number of shards (/contents/tech/core-concepts/sharding) for maximum parallelism. - Develop a consensus protocol capable of dynamically handling atomic transactions across necessary shards without causing network delays. - Design an application layer efficient enough to make use of this enormous shard space and consensus mechanism. These design choices have been implemented in the following ways: Pre-Sharded State Model Radix’s consensus protocol, Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol), is designed around the concept of "pre-sharding." It differs from conventional sharding methods by beginning with the partitioning of the ledger into an extensive "shard space." The algorithm allows for a flexible consensus across any number of shards. Partial Ordering Traditional DLTs operate on global ordering, placing all transactions on one timeline. Cerberus enhances this concept by letting each transaction specify which shards are relevant to it. Braiding BFT-Style Consensus Radix employs a new form of consensus called “braiding”. This method intertwines the consensus instances (shards) involved in a transaction, creating a synchronized, reliable multi-shard transaction. dApps and Smart Contracts Scaling For optimal scalability, understanding the relationship between the consensus layer and the application layer is essential. The application layer sets the network's capabilities, while the Radix Engine determines which substates are relevant to a given transaction. 
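The all-or-nothing behaviour described in the Atomicity section above can be sketched in plain Python. This is purely illustrative: the `AtomicTransaction` helper and the in-memory "components" are hypothetical constructs for this page, not Radix or Scrypto code.

```python
import copy

class AtomicTransaction:
    """Illustrative all-or-nothing transaction over in-memory 'components'.

    Snapshots each component's state on entry; if any operation inside the
    block raises, every component is restored to its snapshot, so no
    partial effects survive.
    """
    def __init__(self, *components):
        self.components = components
        self.snapshots = []

    def __enter__(self):
        # Take a deep copy of each component's state before any mutation.
        self.snapshots = [copy.deepcopy(c) for c in self.components]
        return self

    def __exit__(self, exc_type, exc, tb):
        if exc_type is not None:
            # Roll back: restore every component to its pre-transaction state.
            for comp, snap in zip(self.components, self.snapshots):
                comp.clear()
                comp.update(snap)
            return True  # suppress the error; the transaction simply never happened
        return False  # no error: the mutations stand (committed)

lender = {"xrd": 1_000}
trader = {"xrd": 0}

# A failing multi-step transaction: borrow, then a later step raises.
with AtomicTransaction(lender, trader):
    lender["xrd"] -= 500
    trader["xrd"] += 500
    raise RuntimeError("arbitrage leg failed")  # triggers rollback

assert lender == {"xrd": 1_000} and trader == {"xrd": 0}  # no partial effects

# A successful transaction commits as one unit.
with AtomicTransaction(lender, trader):
    lender["xrd"] -= 100
    trader["xrd"] += 100

assert lender == {"xrd": 900} and trader == {"xrd": 100}
```

Because the failed transfer rolls both balances back, no observer can ever see the borrowed funds without the repayment, which is precisely the property that flash loans rely on.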
Radix Engine Design Choices - Global Object Tokens: Radix considers tokens as global objects at the platform level, optimizing the movement of assets. - Intent-Based Transactions: Transactions on Radix specify intentions rather than explicit actions. For example, rather than specifying particular tokens for a trade, the intention of exchanging a quantity is indicated. - Single Shard Smart Contracts: Each smart contract (or “component”) on Radix is allocated to a single shard. This ensures optimized, uninterrupted functioning. Unlimited dApp Throughput Radix's design aims to offer unlimited throughput for an ecosystem of DeFi dApps. Its system contrasts with Ethereum's, where dApps have limited throughput due to a slow global consensus process. Conversely, Radix's combination of the Radix Engine and Cerberus achieves high parallelism, ensuring optimized transactions and maximum efficiency. Challenges and Criticisms Atomic composability, while powerful, is not without challenges: - Performance Overhead: Ensuring atomicity, especially in a distributed setting, can introduce performance overhead due to the need for synchronization mechanisms. - Complexity: Implementing atomic composability might complicate system design or make certain transaction types more challenging to execute. - Scalability Concerns: As systems scale, maintaining atomic composability across a growing number of nodes or transactions can pose scalability challenges. Critics have also pointed out that while atomic composability is a valuable feature, it may not always be necessary for all types of transactions, especially simpler ones. Balancing the needs for atomicity with system performance and simplicity is a key consideration. ## Building Radix's Developer Pipeline: Nine Events and Counting! 
URL: https://radix.wiki/blog/building-radixs-developer-pipeline-nine-events-and-counting Updated: 2026-02-08 Building Radix's Developer Pipeline: Nine Events and Counting 4 min read How university workshops can be the foundation for a self-sustaining Radix education movement. Since the Radix Wiki Hackathon (/contents/history/radix-wiki-hackathon-1) back in April 2024, our thesis has been that developers are the primary customers of distributed ledgers. The best time to onboard a developer is at the start of their career, before they've accumulated sunk costs in another ecosystem's tokens and tooling. If Radix could build the most effective onboarding program for new developers, it would secure a compounding strategic advantage. The Scrypto 101 (/contents/resources/radix-developer-resources) and Step by Step (/contents/resources/radix-developer-resources) courses already existed, but they weren't reaching junior developers in sufficient numbers, so we partnered with the Radix Foundation to put these tools directly into the hands of computer science students at UK universities. Stage 1: 2024 Stage 1 (/blog/dappinaday-stage-1-complete) was a humbling proof-of-concept: three workshops at Westminster (/contents/history/dapp-in-a-day-workshop-1), St Mary's (/contents/history/dapp-in-a-day-workshop-2), and Roehampton (/contents/history/dapp-in-a-day-workshop-3-roehampton) Universities in late 2024. The documentation had never been properly stress-tested at scale, and a hundred students with different operating systems and setups quickly exposed it as out of date and confusing. The students were also far more junior than anticipated - many had never heard of Web3, let alone smart contracts. This was a huge surprise. At Westminster, with a mentor-to-student ratio of 1:15, we managed about a 7% Stokenet deployment rate. At St Mary's, with a smaller cohort and a 1:5 ratio (and Dan Hughes in the room 💙), that jumped to 33%. 
At Roehampton, where 70 students showed up and a surprise dependency update threw our installations into chaos, we scraped together only a single deployment. However, this was never about getting things perfect on the first try. Radix’s tech stack is far superior to other chains (/blog/radix-is-what-web3-noobs-think-they-bought), but there are certainly still things we can learn from them about building a productive developer ecosystem. The workshops surfaced problems that would otherwise still be silently hampering Radix adoption, and the RDX Works team immediately started work on a one-click installation solution. Stage 2: 2024/25 Stage 2 ran through 2025 with a more realistic understanding of what we were dealing with. Brunel's Blockchain Society invited us for "Brilliant On Chain (/contents/history/dapp-in-a-day-workshop-4)" in December 2024, where 58 attendees worked through the workshop in partnership with SuperteamUK, sharing the evening with a panel featuring experts from Radix, Solana, and Stellar. We returned to St Mary's (/contents/history/dapp-in-a-day-workshop-5) ESports Arena in February with Shardspace (/ecosystem/shardspace) as sponsor, running a tighter session with 11 participants. Then came Brunel Hack 25 (/contents/history/brunel-hack-25) in July - a two-day hackathon that drew ~200 people and produced three genuine project submissions, including Streamflow (https://github.com/Sahid-m/radix-hack), a decentralized tipping platform for content creators. In September, again sponsored by Shardspace, we partnered with Crystalisr and LSBU for Hack The System (/contents/history/hack-the-system) in Croydon, a community wealth building event exploring how Web3 could serve South London's creative economy. By November, Roehampton (/contents/history/dapp-in-a-day-workshop-7-roehampton) had invited us back for their Career Development Week, where 35 students worked their way through the curriculum and two more deployed to testnet. 
The numbers only tell part of the story: nine events, well over 300 attendees, and around 220 component deployments to Stokenet. Brunel Hack alone accounted for more than 200 of those deployments, demonstrating what becomes possible when the tooling works and participants have sufficient time. But the real value lies in the infrastructure being built around these events. There’s an almost insatiable appetite for these kinds of events. Universities are now approaching us for repeat bookings. Students who attended early workshops returned as mentors. Blockchain societies have formed at Westminster and St Mary's. We've been invited to contribute to academic modules, careers fairs, and industrial liaison meetings. Contacts from St Mary's want to help roll out workshops across India. Each event spawns more opportunities than we can manage. Stage 3: Radix Global Building in a desert is hard work - lonely, uncertain, and easy to dismiss. But in the end you get Vegas or Dubai, and people fly in from all over the world to see what you've made. That's the bet we're making with Radix. The ecosystem is still relatively sparse compared to Ethereum or Solana, and convincing students to invest their time in a platform without the network effects of incumbents requires a leap of faith on both sides. But every developer who deploys their first Scrypto component, shares an NFT with a friend, and experiences the absence of gas wars and failed transactions becomes an early inhabitant of a city that doesn't exist yet. When Radix hits critical mass, they'll already know their way around. The vision remains what it was in the original proposal: to build something like an ETH Global for Radix, where events become self-sustaining through sponsor funding rather than grants. 
The path there runs through continued refinement - containerizing dependencies into genuine one-click solutions, establishing alumni pipelines, securing tiered sponsorship packages, and developing repeatable playbooks that work with any venue. Polygon hosted 300-400 hackathons across India in its early stages and now has a market cap of $3bn with 2000+ dApps - proportionally far more than Ethereum. The difference between Radix and competitors won't come down to the quality of the technology alone; it'll come down to whether the next generation of developers encounter Radix first, find it a delight to build with, and never look back. RADIX.wiki (http://RADIX.wiki) is a knowledge and community hub for the Radix ecosystem. Follow us on Twitter (https://twitter.com/RadixWiki) for updates. Contents (/contents) | Ecosystem (/ecosystem) ## Hack The System - London South Bank University URL: https://radix.wiki/contents/history/hack-the-system Updated: 2026-02-08 Summary: Hack the System is a community wealth building movement, which held its inaugural event on the 19-20 September, 2025, and aims to build Web3-powered marketplaces for South London. Hack the System is a community wealth building (https://en.wikipedia.org/wiki/Community_wealth_building) movement, which held its inaugural event on the 19-20 September, 2025, and aims to build Web3-powered marketplaces for South London. 
Details DATE: 19-20 September, 2025 TIMES: Friday: 17:00 - 20:00; Saturday: 9:00 - 16:00 LOCATION: Electric House, London South Bank University, 3 Wellesley Rd, Croydon CR0 2AG MENTORS: Avaunt (https://x.com/a_vaunt) (ShardSpace) REGISTRATION: Eventbrite (https://www.eventbrite.co.uk/e/hack-the-system-19-21-sep-tickets-1598556117379?aff=oddtdtcreator) https://maps.app.goo.gl/BjyVKG7FMNnEKFHz6 (https://maps.app.goo.gl/BjyVKG7FMNnEKFHz6) Media https://x.com/RadixWiki/status/1960002331761635825 (https://x.com/RadixWiki/status/1960002331761635825) https://x.com/RadixWiki/status/1967928507641557232 (https://x.com/RadixWiki/status/1967928507641557232) https://x.com/radixdlt/status/1968194817936200055 (https://x.com/radixdlt/status/1968194817936200055) https://x.com/RegenCollectiv3/status/1969022151358394485 (https://x.com/RegenCollectiv3/status/1969022151358394485) https://x.com/RadixWiki/status/1971591013572517946 (https://x.com/RadixWiki/status/1971591013572517946) https://x.com/crystalisr/status/1971818967896674722 (https://x.com/crystalisr/status/1971818967896674722) #hackthesystem #web3 #communitywealth #collaboration | Philip Porter (https://www.linkedin.com/posts/philip-porter-462b8929_hackthesystem-web3-communitywealth-activity-7367110832511455232-B-HA) #web3 #blockchain #communitywealth #innovation | Imogene Garthwaite (https://www.linkedin.com/posts/imogene-garthwaite-91293419a_web3-blockchain-communitywealth-ugcPost-7375480541313126400-5L4W) Last weekend, David Randall hosted Hack the System UK, bringing together leading voices to co-create the first blueprint for a South London community economy powered by tokenization, staking, and… | Michael Hernandez (https://www.linkedin.com/posts/michael-hernandez-363484223_last-weekend-david-randall-hosted-hack-the-activity-7377704353773666304-Tw8p) #communitywealthbuilding #radix #hackthesystem #hackthesystem | David Randall 
(https://www.linkedin.com/posts/davidchrandall_web3-blockchain-communitywealth-activity-7375659669714907136-1jvE/) What happens when a city decides that its cultural scene is worth saving? - Monocle | David Randall (https://www.linkedin.com/posts/davidchrandall_what-happens-when-a-city-decides-that-its-activity-7382369745125408768-wZ8B) Participants Gallery History HackTheSystem was conceived in 2022 by a coalition of organizations including Crystalisr (https://www.crystalisr.coop) , Regens Unite (https://www.regensunite.earth) , and Blockchain Radicals, alongside academic partners like London South Bank University (https://www.lsbu.ac.uk) . The initiative emerged from converging interests in regenerative economics, smart-city development, decentralized arts, and participatory democracy movements. Initially conceptualized as a single event, the first HackTheSystem gathering was postponed as organizers refined their approach and built stronger community relationships. After a three-year development period, HackTheSystem was relaunched in 2025 with a more focused methodology and clearer objectives. The initiative's evolution reflected growing interest in using blockchain and distributed ledger technologies, particularly the Radix ledger, to create community-owned cooperative structures modeled after established mutual organizations. Purpose & Vision To develop multiple Web3 platforms that connect local artists, creatives, and service providers with buyers in the South London Partnership area (1.2 million people), using blockchain technology to facilitate transactions and build community wealth. Through exhibitor showcases and self-organizing teams, create diverse local supply chains where Hipsters (cultural visionaries), Hackers (technical builders), and Hustlers (business developers) collaborate to build innovative solutions. 
Teams will form organically around exhibitor platforms - for example, an art sales platform might attract artists, promoters, developers, and art buyers who will work together to enhance the platform according to their collective requirements. Each team will compete to build the best community wealth building solution, with prizes awarded for the most innovative and viable platforms that keep wealth circulating within the South London community. Terms and Conditions The following terms apply to participation in this workshop ("Workshop"). Entrants may create original solutions, prototypes, datasets, scripts, or other content, materials, discoveries or inventions (a "Submission"). The Hackathon is organized by the Workshop Organizer. Entrants retain ownership of all intellectual and industrial property rights (including moral rights) in and to Submissions. As a condition of submission, Entrant grants the Workshop Organizer, its subsidiaries, agents and partner companies, a perpetual, irrevocable, worldwide, royalty-free, and non-exclusive license to use, reproduce, adapt, modify, publish, distribute, publicly perform, create a derivative work from, and publicly display the Submission. Entrants provide Submissions on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. PHOTOGRAPHY AND VIDEO CONSENT By participating in this Workshop, Entrants acknowledge and consent to the Workshop Organizer and its authorized representatives taking photographs, videos, audio recordings, or other media ("Media") of Entrants during the Workshop. 
Entrants grant the Workshop Organizer, its subsidiaries, agents and partner companies, a perpetual, irrevocable, worldwide, royalty-free, and non-exclusive license to use, reproduce, adapt, modify, publish, distribute, publicly perform, create derivative works from, and publicly display such Media for promotional, marketing, educational, documentation, or other business purposes in any medium now known or hereafter developed, including but not limited to websites, social media, publications, and presentations. ## Brunel Hack 25 - Brunel University URL: https://radix.wiki/contents/history/brunel-hack-25 Updated: 2026-02-08 Summary: Brunel Hack 25 was a 2-day hackathon on the 12-13th of July, 2025, organized by the Brunel Society of Blockchain and held at the university campus in Uxbridge. Brunel Hack 25 was a 2-day hackathon on the 12-13th of July, 2025, organized by the Brunel Society of Blockchain and held at the university campus in Uxbridge. https://youtu.be/kRQzyZQoy64 (https://youtu.be/kRQzyZQoy64) Details TELEGRAM: https://t.me/RadixHackathon (https://t.me/RadixHackathon) DATE: 12-13th July, 2025 TIME: 10:00 - 22:00 LOCATION: Michael Sterling Building, Brunel University, Uxbridge UB8 3PH MENTORS: Abdur Razzak (https://www.linkedin.com/in/-abdur-razzak/) REGISTRATION: Eventbrite (https://www.eventbrite.com/e/brunel-hack-25-blockchain-hackathon-festival-10k-usd-in-prize-tickets-1380033640769) https://maps.app.goo.gl/Pt7UGYKrgopNJMiQ8 (https://maps.app.goo.gl/Pt7UGYKrgopNJMiQ8) Media https://x.com/RadixWiki/status/1932339092424941990 (https://x.com/RadixWiki/status/1932339092424941990) https://x.com/Caymann22/status/1944516033156653290 (https://x.com/Caymann22/status/1944516033156653290) https://x.com/brunelonchain/status/1945446761884946700 (https://x.com/brunelonchain/status/1945446761884946700) https://x.com/RadixWiki/status/1957154692712276179 (https://x.com/RadixWiki/status/1957154692712276179) Gallery Winners StreamFlow 
https://github.com/Sahid-m/radix-hack (https://github.com/Sahid-m/radix-hack) Streamflow is a decentralized tipping platform designed for content creators and streamers. The platform aims to provide an alternative to traditional streaming services like YouTube and Twitch by eliminating commission fees typically charged by these platforms when viewers send tips to their favorite creators. Technology and Features Streamflow operates using blockchain technology and smart contracts. The platform's key innovation is its batch processing system, which uses Radix subintents to collect multiple tips together before sending them to creators. This batching mechanism is designed to reduce transaction fees (gas fees) by processing multiple payments simultaneously rather than individually. Users connect their cryptocurrency wallets to the platform and can select creators to support using various tokens. The system accumulates tips in batches and automatically distributes them to creators once predetermined thresholds are reached. User Interface The platform provides analytics dashboards for both tippers and streamers, displaying statistics such as tip amounts, frequency, token types used, and creator rankings. Users can track their tipping history and view comprehensive analytics about their platform activity. Purpose Streamflow was created to address what its developers view as a monopolistic practice by major streaming platforms that take commissions from creator tips. The platform positions itself as ensuring that creators receive the full amount of tips sent by their supporters without third-party deductions. 
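The batching mechanism described above can be sketched as follows. This is an illustrative Python sketch of the general idea, not Streamflow's actual code: the `TipBatcher` class, the threshold value, and the `settle` callback are all hypothetical names invented for this example.

```python
from collections import defaultdict

class TipBatcher:
    """Illustrative sketch of threshold-based tip batching.

    Tips accumulate per creator and are paid out in a single batch once a
    predetermined threshold is reached, so many small tips incur one
    settlement (one on-ledger transaction) instead of one per tip.
    """
    def __init__(self, threshold, settle):
        self.threshold = threshold
        self.settle = settle            # callback: (creator, amount) -> None
        self.pending = defaultdict(int)

    def tip(self, creator, amount):
        # Accumulate off the critical path; settle only on threshold crossing.
        self.pending[creator] += amount
        if self.pending[creator] >= self.threshold:
            self.settle(creator, self.pending.pop(creator))

payouts = []
batcher = TipBatcher(threshold=100, settle=lambda c, amt: payouts.append((c, amt)))

for _ in range(4):
    batcher.tip("alice", 30)  # 30, 60, 90 stay pending; 120 crosses the threshold

assert payouts == [("alice", 120)]   # one settlement covers all four tips
assert dict(batcher.pending) == {}   # nothing left pending for alice
```

Batching trades per-tip settlement cost for one settlement per threshold crossing, which is the fee saving the Streamflow team describes.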
Team Sahid Munjavar (https://www.linkedin.com/in/sahidm/) , Ibad Ullah Zuberi (https://www.linkedin.com/in/ibad-ullah-zuberi/) , Abdul Wasey (https://www.linkedin.com/in/abdul-wasey-ahmed-khawaja-070a71170/) https://youtu.be/WvYQmypMfFc (https://youtu.be/WvYQmypMfFc) https://youtu.be/1ZpS_ew0IlQ (https://youtu.be/1ZpS_ew0IlQ) Other Participants Xio https://github.com/EnaihoVFX/XIO (https://github.com/EnaihoVFX/XIO) Xio is a fast-paced, blockchain-powered arena game where players battle using NFT-based characters, trade assets in an on-chain marketplace, and compete on a transparent, tamper-proof leaderboard — redefining ownership and fairness in gaming.  Team:  Caymann Velingkar (https://www.linkedin.com/in/caymann-velingkar-975014207/) , Usman, Enaiho Uwas Paul (https://www.linkedin.com/in/enaiho-uwas-paul/) https://youtu.be/5bJFytgi1EE (https://youtu.be/5bJFytgi1EE) https://youtu.be/wvJYCLLZDiY (https://youtu.be/wvJYCLLZDiY) Cam Tech https://github.com/YadidyaM/radix (https://github.com/YadidyaM/radix) Cam Tech is a Radix blockchain-based intent escrow trading platform MVP.  Team:  Yadidya Medepalli (https://www.linkedin.com/in/yadidya-medepalli/) , Monica Jayakumar (https://www.linkedin.com/in/monicajayakumar/) , Amrin Asokan (https://www.linkedin.com/in/amrin-asokan/) Presentation:  https://drive.google.com/open?id=1H30OUjjUt5JyyCHgWqzepZktgvSrWP1H (https://drive.google.com/open?id=1H30OUjjUt5JyyCHgWqzepZktgvSrWP1H) Demo Video:  https://drive.google.com/file/d/1pVwxyz-GCF6j0mu-DPbS670C47yjsMqU/view (https://drive.google.com/file/d/1pVwxyz-GCF6j0mu-DPbS670C47yjsMqU/view) DApps Deployed https://stokenet.radxplorer.com/transactions/txid_tdx_2_1qzsd4k4h85yfwja6yhjy0kkz2vywctuajm773ftjnkn94zemnzsqyvp67l (https://stokenet.radxplorer.com/transactions/txid_tdx_2_1qzsd4k4h85yfwja6yhjy0kkz2vywctuajm773ftjnkn94zemnzsqyvp67l) Terms and Conditions The following terms apply to participation in this workshop ("Workshop"). 
Entrants may create original solutions, prototypes, datasets, scripts, or other content, materials, discoveries or inventions (a “Submission”). The Hackathon is organized by the Workshop Organizer. Entrants retain ownership of all intellectual and industrial property rights (including moral rights) in and to Submissions. As a condition of submission, Entrant grants the Workshop Organizer, its subsidiaries, agents and partner companies, a perpetual, irrevocable, worldwide, royalty-free, and non-exclusive license to use, reproduce, adapt, modify, publish, distribute, publicly perform, create a derivative work from, and publicly display the Submission. Entrants provide Submissions on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. ## DApp In a Day Workshop #5 - St Mary’s University URL: https://radix.wiki/contents/history/dapp-in-a-day-workshop-5 Updated: 2026-02-08 Summary: DApp In A Day 5 was the fifth DApp In A Day Scrypto workshop, sponsored by Shardspace. It was held at St Mary’s University’s ESports Arena on the 26th of February, 2025. DApp In A Day #5 was the fifth DApp In A Day Scrypto workshop, sponsored by Shardspace. It was held at St Mary’s University’s ESports Arena on the 26th of February, 2025. ShardSpace (https://shardspace.app/) Event Sponsor - Shardspace https://youtu.be/tKd2PQOXvUI (https://youtu.be/tKd2PQOXvUI) Details TELEGRAM: https://t.me/RadixHackathon (https://t.me/RadixHackathon) DATE: Wednesday, 26th February, 2025 TIME: 10:00 - 17:00 LOCATION: SMU ESports Arena, Waldegrave Road, Twickenham, TW1 4SX. 
MENTORS: Ascarbek (http://t.me/ascarbek) (Shardspace) REGISTRATION: Eventbrite (https://www.eventbrite.co.uk/e/dapp-in-a-day-web3-workshop-workshop-6-tickets-1249180254769) https://maps.app.goo.gl/sz3zYwxhUYH196Ly5 (https://maps.app.goo.gl/sz3zYwxhUYH196Ly5) Media https://x.com/radixdlt/status/1883637405447799180 (https://x.com/radixdlt/status/1883637405447799180) https://x.com/radixdlt/status/1894087468460409046 (https://x.com/radixdlt/status/1894087468460409046) “The Web3 workshop was incredible and huge thanks to the Radix Wiki team for making it such an inspiring event. Already looking forward to the next one!” (https://x.com/dinesh_adith/status/1895484194933125286) - Dinesh “A massive shoutout to Radix.Wiki and to St Mary’s University for hosting this event in their cutting-edge Esports Arena—a perfect example of how technology is rapidly evolving!” (https://x.com/Ashmy3101/status/1895483484749377945) - Ashmy https://x.com/RadixWiki/status/1884354464225566805 (https://x.com/RadixWiki/status/1884354464225566805) https://x.com/Adam_XRD/status/1894097346947936731 (https://x.com/Adam_XRD/status/1894097346947936731) https://x.com/Radix_ecosystem/status/1891837989321424911 (https://x.com/Radix_ecosystem/status/1891837989321424911) https://x.com/RadixWiki/status/1894710939326251093 (https://x.com/RadixWiki/status/1894710939326251093) https://x.com/radixdlt/status/1910616454829105605 (https://x.com/radixdlt/status/1910616454829105605) https://x.com/RadixWiki/status/1896607305384255898 (https://x.com/RadixWiki/status/1896607305384255898) Gallery Terms and Conditions The following terms apply to participation in this workshop ("Workshop"). Entrants may create original solutions, prototypes, datasets, scripts, or other content, materials, discoveries or inventions (a “Submission”). The Hackathon is organized by the Workshop Organizer. Entrants retain ownership of all intellectual and industrial property rights (including moral rights) in and to Submissions. 
As a condition of submission, Entrant grants the Workshop Organizer, its subsidiaries, agents and partner companies, a perpetual, irrevocable, worldwide, royalty-free, and non-exclusive license to use, reproduce, adapt, modify, publish, distribute, publicly perform, create a derivative work from, and publicly display the Submission. Entrants provide Submissions on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. ## Brilliant On Chain / DApp In A Day #4 - Brunel University URL: https://radix.wiki/contents/history/dapp-in-a-day-workshop-4 Updated: 2026-02-08 Summary: Brilliant On Chain was an event organized by Brunel University’s Blockchain Society. It was part of the DApp In A Day Scrypto workshops series and held at Brunel University on the 4th of December, 2024. Brilliant On Chain was an event organized by Brunel University’s Blockchain Society. It was part of the DApp In A Day Scrypto workshops series and held at Brunel University on the 4th of December, 2024. Details TELEGRAM: https://t.me/RadixHackathon (https://t.me/RadixHackathon) DATE: 4th December, 2024 TIME: 13:00 - 20:00 LOCATION: Michael Sterling Building, Brunel University, Uxbridge UB8 3PH MENTORS: Azizi A (https://www.linkedin.com/in/azizi-adeyemo/), Abdur Razzak (https://www.linkedin.com/in/-abdur-razzak/) REGISTRATION: Eventbrite (https://www.eventbrite.com/e/brunel-hack-25-blockchain-hackathon-festival-10k-usd-in-prize-tickets-1380033640769) AGENDA: 13:00 - 13:10 - Kick-off and Welcome. 13:15 - 17:00 - Workshop with Radix: Develop and Deploy Your First Web3 Solution. 17:30 - 18:00 - Opportunities Of Stellar. 18:15 - 19:00 - Panel Discussion: Brilliant On Chain. 19:00 - 20:00 - Photo Session & Networking. 
https://maps.app.goo.gl/Pt7UGYKrgopNJMiQ8 (https://maps.app.goo.gl/Pt7UGYKrgopNJMiQ8) Media https://x.com/RadixWiki/status/1864256487071072590 (https://x.com/RadixWiki/status/1864256487071072590) https://x.com/RadixWiki/status/1864415256359129300 (https://x.com/RadixWiki/status/1864415256359129300) “A huge thank you to [RADIX.wiki] for running an exceptional Web3 development workshop. The hands-on experience and one-to-one guidance left a lasting impression on our participants.” - Jyotirmoy “What an amazing event! It was an honour to be part of such a knowledgeable and dedicated team. Thanks to everyone who contributed to making this a success. This is just the beginning of the Brunel Society of Blockchain!” - Aman Gallery Terms and Conditions The following terms apply to participation in this workshop ("Workshop"). Entrants may create original solutions, prototypes, datasets, scripts, or other content, materials, discoveries or inventions (a “Submission”). The Hackathon is organized by the Workshop Organizer. Entrants retain ownership of all intellectual and industrial property rights (including moral rights) in and to Submissions. As a condition of submission, Entrant grants the Workshop Organizer, its subsidiaries, agents and partner companies, a perpetual, irrevocable, worldwide, royalty-free, and non-exclusive license to use, reproduce, adapt, modify, publish, distribute, publicly perform, create a derivative work from, and publicly display the Submission. Entrants provide Submissions on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. ## DApp In a Day Workshop #3 - Roehampton University URL: https://radix.wiki/contents/history/dapp-in-a-day-workshop-3-roehampton Updated: 2026-02-08 Summary: DApp In A Day 3 was part of the DApp In A Day Scrypto workshops series. 
It was held at Roehampton University on the 4th of November, 2024. DApp In A Day #3 was part of the DApp In A Day Scrypto workshops series. It was held at Roehampton University on the 4th of November, 2024. Details TELEGRAM: https://t.me/RadixHackathon (https://t.me/RadixHackathon) DATE: 4th November, 2024 TIME: 10:00 - 17:00 LOCATION: DB117, Sir David Bell Building, Digby Stuart College, Roehampton Lane, London, SW15 5PU. MENTORS: @beemdvp (https://x.com/beemdvp) & @f_pieper (https://x.com/f_pieper) https://maps.app.goo.gl/4GL9WWTJKPJLuCCZ9 (https://maps.app.goo.gl/4GL9WWTJKPJLuCCZ9) Gallery Terms and Conditions The following terms apply to participation in this workshop ("Workshop"). Entrants may create original solutions, prototypes, datasets, scripts, or other content, materials, discoveries or inventions (a “Submission”). The Hackathon is organized by the Workshop Organizer. Entrants retain ownership of all intellectual and industrial property rights (including moral rights) in and to Submissions. As a condition of submission, Entrant grants the Workshop Organizer, its subsidiaries, agents and partner companies, a perpetual, irrevocable, worldwide, royalty-free, and non-exclusive license to use, reproduce, adapt, modify, publish, distribute, publicly perform, create a derivative work from, and publicly display the Submission. Entrants provide Submissions on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. ## DApp In a Day Workshop #2 - St Mary’s University URL: https://radix.wiki/contents/history/dapp-in-a-day-workshop-2-st-marys-university Updated: 2026-02-08 Summary: DApp In A Day 2 was part of the DApp In A Day Scrypto workshops series. It was held at St Mary’s University on the 30th of October, 2024. 
Details TELEGRAM: https://t.me/RadixHackathon (https://t.me/RadixHackathon) DATE: 30th October, 2024 TIME: 10:00 - 18:00 LOCATION: Room E13, SMU ESports Arena, Waldegrave Road, Twickenham, TW1 4SX. MENTORS: @beemdvp (https://x.com/beemdvp) & @f_pieper (https://x.com/f_pieper) REGISTRATION: Eventbrite (https://www.eventbrite.co.uk/e/dapp-in-a-day-web3-workshop-tickets-1026774248417) https://maps.app.goo.gl/sz3zYwxhUYH196Ly5 (https://maps.app.goo.gl/sz3zYwxhUYH196Ly5) Media https://x.com/RadixWiki/status/1851767276346769451 (https://x.com/RadixWiki/status/1851767276346769451) Gallery “Thank you so much, the workshops were great and I learned a lot!” - L.L “Thank you! It was a great experience, you have a great team and all so helpful! Very rare to find!” - R.B “Had an amazing day at the workshop—great sessions and fantastic connections! Huge thanks to everyone involved for making it memorable.” - Caymann “I really enjoyed developing on Radix and is something I would wanna learn more about, in your future events I can also bring some of my class mates who would be interested in it.” - S.C ## DApp In a Day Workshop #1 - Westminster University URL: https://radix.wiki/contents/history/dapp-in-a-day-workshop-1 Updated: 2026-02-08 Summary: DApp In A Day 1 was part of the DApp In A Day Scrypto workshops series. It was held at Westminster University on the 28th of October, 2024. TELEGRAM: https://t.me/RadixHackathon (https://t.me/RadixHackathon) DATE: 28th October, 2024 TIME: 10:00 - 18:00 LOCATION: Room C103, 115 New Cavendish Street, London W1W 6UW. MENTORS: @beemdvp (https://x.com/beemdvp) & @f_pieper (https://x.com/f_pieper) https://maps.app.goo.gl/Mqobgt4bpEVFsNUa6 (https://maps.app.goo.gl/Mqobgt4bpEVFsNUa6) https://x.com/RadixWiki/status/1851313333153980913 (https://x.com/RadixWiki/status/1851313333153980913) ## DApp In a Day Workshop #7 - Roehampton University URL: https://radix.wiki/contents/history/dapp-in-a-day-workshop-7-roehampton Updated: 2026-02-08 Summary: DApp In A Day 7 was the 7th event in the DApp In A Day Scrypto workshops series. It was held at Roehampton University on the 3rd of November, 2025 during the university’s Career Development Week. Details TELEGRAM: https://t.me/RadixHackathon (https://t.me/RadixHackathon) DATE: 3rd November, 2025 TIME: 10:00 - 16:00 LOCATION: DB117, Sir David Bell Building, Digby Stuart College, Roehampton Lane, London, SW15 5PU.
MENTORS: Avaunt (https://x.com/a_vaunt) (ShardSpace) REGISTRATION: Handshake (https://app.joinhandshake.co.uk/edu/events/55654) Media https://x.com/RadixWiki/status/1973045223212171438 (https://x.com/RadixWiki/status/1973045223212171438) https://x.com/RadixWiki/status/1985795338004021341 (https://x.com/RadixWiki/status/1985795338004021341) This week is career development week at Roehampton University and we kicked things off with an exciting Hackathon and workshop hosted by Radix, diving into their revolutionary smart contract language… | Mastaneh Davis PhD, SFHEA (https://www.linkedin.com/posts/mastaneh-davis-phd-sfhea-a4a72a17_this-week-is-career-development-week-at-roehampton-activity-7391625132991217664-QJe6) Gallery DApps Deployed https://stokenet.radxplorer.com/transactions/txid_tdx_2_1lavfe7kmhxnuz2dte4kzchwwh6krrzqvdzaefzp4u9csaxswr5tqfm87yx (https://stokenet.radxplorer.com/transactions/txid_tdx_2_1lavfe7kmhxnuz2dte4kzchwwh6krrzqvdzaefzp4u9csaxswr5tqfm87yx) https://stokenet.radxplorer.com/transactions/txid_tdx_2_1lfz032d9j3uk43peg5fzyhmse9uuhpeupwvrdgw528jl42wajjfs602q3c (https://stokenet.radxplorer.com/transactions/txid_tdx_2_1lfz032d9j3uk43peg5fzyhmse9uuhpeupwvrdgw528jl42wajjfs602q3c) Terms and Conditions PHOTOGRAPHY AND VIDEO CONSENT By participating in this Workshop, Entrants acknowledge and consent to the Workshop Organizer and its authorized representatives taking photographs, videos, audio recordings, or other media ("Media") of Entrants during the Workshop. Entrants grant the Workshop Organizer, its subsidiaries, agents and partner companies, a perpetual, irrevocable, worldwide, royalty-free, and non-exclusive license to use, reproduce, adapt, modify, publish, distribute, publicly perform, create derivative works from, and publicly display such Media for promotional, marketing, educational, documentation, or other business purposes in any medium now known or hereafter developed, including but not limited to websites, social media, publications, and presentations. ## #DAppInADay Stage 1 Complete! URL: https://radix.wiki/blog/dappinaday-stage-1-complete Updated: 2026-02-08 #DAppInADay Stage 1 Complete! 6 min read The end of the beginning… After a decade of building Radix, there's no time to lose in putting its tools into the hands of tomorrow's builders. That's exactly what we set out to do in partnering with the Radix Foundation, the Radix Community Council, and RDX Works (/ecosystem/rdx-works) to run three #DAppInADay Scrypto workshops across London universities. The mission was simple: teach students to build and deploy their first Web3 applications on Radix. What we discovered after landing at Westminster, St Mary's, and Roehampton Universities was both challenging and enlightening – and might just hold the key to how Radix wins the race for Web3 adoption.
Workshop Overview: Three Universities, Three Stories Each workshop followed roughly the same playbook: introduce Web3 and Radix, dive into Scrypto, then help students deploy their first dApps. Simple enough in theory – but as with all good stories, reality had other plans. Westminster: The Pionooors Our first workshop (/contents/history/dapp-in-a-day-workshop-1) at Westminster University began with around 45 students. The session kicked off with an introduction to Web3 and Radix, followed by @beemdvp's Scrypto overview and @f_pieper's deep dive into application building. Our first reality check was that the documentation really needed work. Some of the details were out of date and the layout made it confusing to distinguish between instructions for different operating systems. A second, even bigger reality check was that outside the Crypto Twitter bubble, most computer science students are complete Web3 newcomers. Even installing Homebrew was a first for many. The mentor-to-student ratio of 1:15 proved challenging. By 16:30, we'd lost about half the group to installation hurdles. There was a silver lining, however: those who persevered were so inspired that they started planning their own hackathon society. St Mary's: The Deployoors Two days later (/contents/history/dapp-in-a-day-workshop-2) , we set up shop within St Mary's stunning ESports Arena. With a smaller group (~15 students) and a better mentor ratio of 1:5, magic happened (having the Radix founder, Dan Hughes, there certainly helped!). A third of students deployed to Stokenet (up from ~7% at Westminster). Some even created and shared their own NFT collections, which felt like watching Web3 adoption in real time. Roehampton's Plot Twist The third workshop (/contents/history/dapp-in-a-day-workshop-3-roehampton) at Roehampton University on the 4th of November brought two of our biggest surprises yet. Instead of the expected ~20 students, 70 showed up!
This forced us to reluctantly turn away 20 due to room capacity. Then, back down to earth, a critical external dependency had been updated over the weekend, throwing our installation process into chaos. Despite Florian's heroic fix, many students had left before we could get them started. Final score: one lonely deployment to Stokenet. But sometimes you learn more from failure than success. Each workshop taught us something crucial about onboarding the next generation of Web3 developers. The RDX Works team, seeing these challenges firsthand, started working on a one-click installation solution. Because let's face it: the easier we make it to start building on Radix, the faster we'll get to that Web3 future we're all dreaming about. Key Learnings: The Good, The Bad, and The Surprising The GenZ Challenge and Opportunity Initial workshops revealed that explaining Web3 to GenZ presents unique challenges. While Bitcoin has familiarized them with digital money, concepts like decentralized operating systems still raise eyebrows. However, this challenge comes with a silver lining: today's Computer Science students typically start coding later (around age 17 versus 12-13 previously), giving them a "clean slate" advantage. Without preconceptions about EVM or smart contracts, they're more receptive to learning asset-oriented programming the Radix way. Technical Realities and Solutions The workshops exposed several critical areas for improvement in the Radix ecosystem. Documentation needs significant enhancement for newcomers, and the mentor-to-student ratio proved crucial – 1:5 works significantly better than 1:15 (big surprise!). The path from installation to testnet deployment must be streamlined, and dependency management requires careful attention. These challenges are being addressed through improved documentation and the development of one-click solutions. 
The results are already showing: testnet deployment rates jumped from 7% at Westminster to 33% at St Mary's with better documentation and mentoring. Growth Catalysts and Market Timing A crucial discovery is that Radix is inherently enjoyable to build with. When students deploy their first NFTs and share them with friends, the technology transforms from abstract concept to practical magic. This excitement creates natural advocates for the platform, each representing a potential catalyst for exponential ecosystem expansion. The timing couldn't be better. With Web3 adoption outpacing the internet's growth curve, these students will likely work in the industry regardless – the question is which platform they'll choose. By engaging them early, Radix positions itself as their default choice through hands-on experience rather than marketing claims. Strategic Vision The vision for scaling these workshops follows a clear progression aimed at establishing a Radix version of ETH Global, where events become self-sustaining through sponsor funding. This evolution unfolds across three strategic stages. The initial proof of concept at Westminster, St Mary's, and Roehampton validated the core concept. Next is the refinement stage, which will evolve through university-initiated workshop requests and strategic partnerships with blockchain societies. A four-tier sponsorship model will be introduced, ranging from merchandise providers to financial sponsors, while operational costs are optimized through improved catering strategies and the development of an alumni mentor pipeline. The third and final commoditization stage aims to create a globally scalable model by expanding beyond UK universities and transitioning to a professional event series with multiple sponsors. This will create a virtuous cycle where successful events at prestigious institutions attract both other universities and sponsors, while growing blockchain society networks facilitate peer-to-peer promotion. 
The evolution will create a sustainable model that doesn't rely on grant funding, with a target for full sustainability by the end of 2025. Each successful deployment and workshop will strengthen the foundation for future expansion, creating a multiplication effect that benefits the entire Radix ecosystem. This strategy leverages the initial success of university workshops to build a sustainable, scalable model for ecosystem growth. By meeting developers where they are – in university classrooms across the world – Radix is positioning itself to become the default Web3 platform for the next generation of builders. The door is open for everyone who wants to help build this future. Whether you're a developer who can mentor, a university that wants to host workshops, or just someone who believes in what we're building – there's room for you in this story. They say the best time to plant a tree was twenty years ago. The second best time is now. The same goes for growing a developer ecosystem. Right now, there are students sitting in university classrooms across the country, waiting to build something amazing. We have the technology. We have the proof of concept. Now it's time to scale. 🚀 Gallery Westminster’s Pionooors St Mary’s Deployooors The Roehampton Perseverooors The Tutooors The Radix Enjoyooors The Eatooors Tip the Author ☕️ Like what you read? Support independent writing on Radix. RADIX.wiki (http://RADIX.wiki) is a knowledge and community hub for the Radix ecosystem. Follow us on Twitter (https://twitter.com/RadixWiki) for updates. Contents (/contents) | Ecosystem (/ecosystem) | Jobs | Talent Pool ## Radix is What Web3 Noobs Think They Bought URL: https://radix.wiki/blog/radix-is-what-web3-noobs-think-they-bought Updated: 2026-02-08 Radix is What Web3 Noobs Think They Bought 6 min read The dream is real, but only Radix can deliver it. The promise of Web3 is an open-source state machine that brings permissionless finance to anyone with an internet connection.
This vision is so powerful that despite being clouded by scaling bottlenecks, network outages, centralization, and complex developer experiences, it is still compelling enough for investors to have poured billions into inherently flawed networks in the vain hope that they can manifest a global finance revolution. Unfortunately, revolutions don’t happen by sheer force of will; they take foresight, planning, and innovation over many years. The only network to have made a true assessment of what the vision requires is Radix. Ten years in the making, with a unique pre-sharded (/contents/tech/core-concepts/sharding) state model, cross-shard consensus (/contents/tech/core-protocols/cerberus-consensus-protocol) and runtime engine (/contents/tech/core-protocols/radix-engine) ; even a bespoke database (/contents/tech/core-protocols/vamos-database) ; Radix was built expressly to bring the entire $400tn financial system onchain and fulfill the true promise of Web3. To paraphrase Dr. Daniel Kim (https://youtu.be/BKNK_mM_P0s) , “Radix is what Web3 noobs think they bought.” Solana: TPS for Ants The Solana Foundation recently announced (https://x.com/SolanaConf/status/1837364256414843304) that their Firedancer client can process 1m TPS. This is an extraordinary feat of engineering, but, basically, pick a meme to satirize how inadequate these numbers are. The world’s stock exchanges collectively host around 2-3m TPS. 
Extrapolation from research by McKinsey (https://www.mckinsey.com/industries/financial-services/our-insights/the-2023-mckinsey-global-payments-report#/) , Grand View Research (https://www.grandviewresearch.com/press-release/global-blockchain-technology-market) , Cisco (https://blogs.cisco.com/networking/iot-and-the-network-what-is-the-future) and Statista (https://www.statista.com/statistics/1101442/iot-number-of-connected-devices-worldwide/) suggests that by 2029 this could reach 20-30m, plus ~1m credit card transactions, ~20m crypto swaps, tens of millions of IoT transactions, and many millions more via AI agents. Throw in relatively new mediums such as Telegram bots and Solana is at least two orders of magnitude short of being the future of finance, without even considering factors such as induced demand (https://en.wikipedia.org/wiki/Induced_demand) or positive feedback (https://en.wikipedia.org/wiki/Positive_feedback) . Solana has chosen to focus on vertically scaling a monolithic chain, rather than horizontally via sharding (/contents/tech/core-concepts/sharding) , which is standard for modern databases, cloud services, big data technologies, search engines and content delivery networks. Radix, on the other hand, has gone all-in, pre-sharding its state model into 2^256 discrete partitions. This affords the network huge efficiency advantages in the form of parallel execution and deterministic indexing, while avoiding the data reorganization costs of dynamic sharding, and the bandwidth congestion (https://youtu.be/ZyIxWutfZ-U?t=131) of single-threaded chains. Radix’s pre-sharded state is a world away from traditional blockchains and took nearly a decade to devise, but the results are validating the approach: Dan Hughes (/community/dan-hughes) and Radix Labs recently hit 2.2m TPS (https://x.com/i/broadcasts/1MYxNMBPXeyJw) on Radix’s Cassandra (/contents/tech/research/cassandra) testnet with only 64 shard groups (/contents/tech/core-concepts/shard-groups) .
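To make "deterministic indexing" concrete, here is a minimal Python sketch of how a pre-sharded address space can work. The hash function, substate ID format, and helper names are illustrative assumptions, not Radix's actual implementation: the point is that any node can compute which shard group owns a substate from its ID alone, with no global lookup table.

```python
import hashlib

SHARD_SPACE = 2 ** 256  # the state model is pre-sharded into 2^256 partitions


def shard_index(substate_id: str) -> int:
    """Deterministically map a substate ID into the 2^256 shard space."""
    digest = hashlib.sha256(substate_id.encode("utf-8")).digest()
    return int.from_bytes(digest, "big")  # uniform value in [0, 2^256)


def shard_group(substate_id: str, groups: int = 64) -> int:
    """Assign a substate to one of `groups` shard groups (64 in the Cassandra test)."""
    return shard_index(substate_id) * groups // SHARD_SPACE


# Any node computes the owning shard group locally, with no coordination:
group = shard_group("account_alice")  # "account_alice" is a made-up ID
assert 0 <= group < 64
```

Because the mapping is a pure function of the ID, the partitioning is fixed at genesis: there is no data to reorganize when capacity grows, only more nodes covering more of the same fixed space.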
Because of the peer-reviewed linear scalability (https://escholarship.org/uc/item/6h427354) of Radix’s cross-shard consensus protocol, Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) , doubling the shard groups will double the TPS capacity, making it the only chain capable of onboarding the global financial system. In a 2019 interview, Solana founder Anatoly Yakovenko said, “I don’t think we would be near the finish line if we had tried to build a sharded chain (https://youtu.be/ZyIxWutfZ-U?t=42) .” In the same interview he refers to sharding as “a really hard computer science problem (https://youtu.be/ZyIxWutfZ-U?t=29) ”, and assumes that hardware and bandwidth improvements will outpace Web3 adoption. In our view, these are all fatal assumptions that will ultimately strand Solana on a local maximum. Firedancer will doubtless squeeze every ounce of performance from its high-spec validators, but Radix can compound its own performance with the same low-level, vertical optimizations on a fundamentally more scalable substrate. Ethereum L1: Turing Complete Shambles Global capital is easily spooked, so it tends to settle in stable, predictable jurisdictions with a strong rule of law such as the UK or Singapore. The more lawless or capricious a country is, the less likely companies are to invest in its long-term future. In Web3, ‘code is law’, but what happens when that law is full of bugs? By design, Ethereum’s smart contract language, Solidity, is Turing complete. This gives developers maximum flexibility when writing applications, but effectively institutes a system of common law that evolves one hack or security breach at a time. Developers must thus spend 90% of their time patching every conceivable exploit in abject fear of being the next rekt.news (http://rekt.news) headline.
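A back-of-the-envelope sketch of what linear scalability implies, using the figures above. This is an idealized model that assumes perfect linearity and ignores coordination overhead; the constants come from the 2.2m TPS Cassandra result at 64 shard groups.

```python
# Linear scaling model: throughput grows proportionally with shard groups.
BASE_GROUPS = 64
BASE_TPS = 2_200_000  # ~2.2m TPS reported at 64 shard groups


def projected_tps(shard_groups: int) -> int:
    """Project capacity under ideal linear scalability (no overhead)."""
    return BASE_TPS * shard_groups // BASE_GROUPS


print(projected_tps(128))   # doubling groups doubles capacity: 4,400,000
print(projected_tps(1024))  # 35,200,000 -- within the 20-30m demand estimate
```

Under this model, meeting the projected 20-30m TPS of global demand is a matter of adding shard groups rather than redesigning the protocol.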
Over time, existing applications harden and become more robust, but the pace of innovation is unnecessarily slow as every novel application or protocol standard is forced to run the same gauntlet. The situation is not helped by the fact that transaction logs are not legible to end users, placing the burden of security entirely on the shoulders of developers. Alongside Ethereum; Aptos, Avalanche, Base, BSC, Cardano, Cosmos, Fantom, Harmony, Hedera, Mixin, NEAR, Polkadot, Polygon, Ronin, Shibarium, Solana, Sui, Terra, Tezos, and Tron are (or were) all Turing complete and together account for over $8bn worth of security breaches since September 2020. Cumulative Cryptocurrency Hack Losses Over Time, log scale. Source: rekt.news/leaderboard Users hoping to colonize prosperous new network states have instead found themselves in chaotic dystopias with few enduring applications. Mercenary, short-term speculation is rampant, but beneath the froth it is likely that the inherent instability of these networks has deterred many users from building Web3 bluechips. Radix has its fair share of meme coins, but under the hood it has a much more opinionated structure that can be likened to well-written statute law. For example, assets on Radix are not defined by hand-written smart contracts, but are native to the protocol and can only exist in a finite set of states. Any attempted violations of these states will fail automatically, without developers having to think about it. Furthermore, all Radix transactions are prefaced with a human-readable manifest that clearly shows which assets will move and where, allowing the end user to form an additional layer of security. Taken together, these measures constitute a stable environment, giving developers the confidence to build for the long term, and the freedom to focus on application logic rather than security. 
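The "finite set of states" idea can be pictured as a small state machine. The following toy Python model is purely illustrative (the state names and transition table are invented, not Radix's actual substate types): it shows how protocol-enforced states make an illegal asset operation fail automatically, with no application code involved.

```python
from enum import Enum, auto


class AssetState(Enum):
    """Illustrative states an asset container might occupy."""
    IN_VAULT = auto()
    IN_WORKTOP = auto()
    BURNED = auto()


# Permitted transitions, enforced by the protocol rather than by app code.
ALLOWED = {
    (AssetState.IN_VAULT, AssetState.IN_WORKTOP),   # withdraw
    (AssetState.IN_WORKTOP, AssetState.IN_VAULT),   # deposit
    (AssetState.IN_WORKTOP, AssetState.BURNED),     # burn
}


def transition(current: AssetState, target: AssetState) -> AssetState:
    """Apply a state change, rejecting anything outside the allowed set."""
    if (current, target) not in ALLOWED:
        raise ValueError(f"illegal asset transition: {current} -> {target}")
    return target


state = transition(AssetState.IN_VAULT, AssetState.IN_WORKTOP)  # a valid withdraw
# transition(AssetState.BURNED, AssetState.IN_VAULT) would raise:
# burned assets cannot be resurrected, so the violation fails automatically.
```

In this scheme a developer never writes the rejection logic; any transition missing from the table is unrepresentable, which is the property the article credits with removing whole classes of exploits.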
Radix’s Transaction Manifest Ethereum L2s: Ground Control to Major F**k-up Around 2021, sharding was dropped from the Ethereum roadmap (https://ethereum.org/en/roadmap/) in favor of scaling via external aggregators known collectively as Layer 2s (L2s). At the time, L2s were likened to shards and the pace of development led many to believe that certain limitations of the architecture could be overcome. Three years later, L2s are orders of magnitude slower than next-generation chains like Solana and Radix, but their biggest flaw by far is that they have fragmented the liquidity, composability, and narrative of the Ethereum ecosystem. The magic of decentralized finance (DeFi) for consumers is that on Layer 1 (L1) chains, applications and pools of capital can be composed like Lego bricks into more complex structures and utilized in a single transaction. Most users might not realize it, but over 70% of Ethereum transactions touch three or more smart contracts (https://x.com/PiersRidyard/status/1583014459467239426) . L2s ruin this harmony by introducing trust boundaries (/contents/tech/core-concepts/trust-boundary) between themselves and the L1, meaning that assets and applications on one L2 can’t natively communicate with those on other L2s or the L1. Strategies do exist for users to bridge assets between the various realms, but all of them ultimately break the atomic commitment (/contents/tech/core-concepts/atomic-composability) that exists between native entities, opening the door for assets to be stranded if one step in a transaction were to fail. It’s absurd to imagine that in the ruthlessly competitive world of mainstream global finance, such an obvious risk would be tolerated, but the Ethereum Foundation are so bereft of solutions that they have let the situation metastasize. In contrast, Radix has insisted on the strongest possible composability guarantees because anything less will not be enough for Web3 to be taken seriously.
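Atomic composability can be sketched in a few lines of toy Python. This is a simplification for intuition, not how any ledger is actually implemented: every step of a transaction commits together or not at all, which is exactly the guarantee that bridging between L2s breaks.

```python
# Toy model of an atomic L1 transaction: effects are staged in a journal
# and only take hold if every step succeeds.


def atomic_transaction(steps):
    """Run steps in order; commit all effects or none of them."""
    journal = []
    try:
        for step in steps:
            journal.append(step())
    except Exception:
        return None  # the whole transaction aborts; nothing is stranded
    return journal  # all effects commit together


ok = atomic_transaction([lambda: "swap", lambda: "lend", lambda: "stake"])
assert ok == ["swap", "lend", "stake"]

failed = atomic_transaction([lambda: "swap", lambda: 1 / 0])  # 2nd step fails
assert failed is None  # the earlier swap never takes effect either
```

A bridged multi-chain flow has no shared journal: each leg commits independently, so a failure halfway through leaves earlier legs finalized and assets potentially stuck on the far side of a trust boundary.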
It’s Decentralization, Stupid Radix’s state model, scripting language, and runtime engine are all ultimately subordinate to one thing: decentralization. We already have a financial system that is generally fast and cheap, but every year our purchasing power is debased without due process, and regulations such as ‘accredited investor’ rules stifle our ability to combat it. Central governments say to their citizens, “We don’t trust you. We’ll run the software.” And they do: the software of money, association, education, healthcare, etc. Web2, as an extension of government, says the same. Bitcoin showed that these problems can only be addressed by ultimately removing monetary policy from centralized control. Web3 takes this innovation and builds on it with a complete network state that trusts its citizens to run the software, fostering a culture of ownership, honor, dignity, and belonging. Technicians tend to discount these considerations as immaterial, but history shows again and again that decentralizing power is essential for prosperity. Even the threat of a User Activated Soft Fork of Bitcoin in 2017 proved that ultimately, Layer 0 is sovereign, not Layer 1. Decentralization also happens to create an extremely hospitable environment for capital, being immune to the friction of national borders, different regulatory environments, credit checks, KYC requirements, etc. This being the case, the most successful network state will be the one that effectively decentralizes power to its users and harnesses both the energetic culture it creates and the safety and freedom it affords. Ultimately, Radix's architecture has been optimized for both the social and economic benefits of decentralization. Large, energy-hungry validators are easy to identify and shut down, but Radix’s 2^256 shards can run on hardware as small as smart watches and IoT devices, making the network both highly participatory and highly resilient against centralized control.
While many Web3 newcomers believe they've invested in a trustless world computer capable of disrupting global finance, the reality is that only Radix has taken the necessary steps to make this dream a reality. In the coming months and years, users and capital will migrate to the only network state without bottlenecks, bandits, or trust boundaries, and build the most liquid and productive economy the world has ever seen. RADIX.wiki (http://RADIX.wiki) is a knowledge and community hub for the Radix ecosystem. Follow us on Twitter (https://twitter.com/RadixWiki) for updates. Contents (/contents) | Ecosystem (/ecosystem) | Jobs | Talent Pool ## Money, Wealth, and Volcanos URL: https://radix.wiki/blog/money-wealth-volcanos Updated: 2026-02-08 Money, Wealth & Volcanos 4 min read Crypto is an exciting new frontier but to understand it you first need to understand money. This is how the world works: FIRST, LOOK AROUND YOU: Look at the crumbling brickwork, potholes, shuttered stores, brownfield sites, peeling wallpaper and threadbare carpets. There’s no shortage of work. But there’s no money. WHERE HAS ALL THE MONEY GONE? The financial district looks like it has plenty of money. The Forex markets swap $5.1tn worth of currencies every day. But it’s a shell game. To quote Peter Thiel: “Finance is the only way to make money when you have no idea how to create wealth.” We don’t lack money, we lack wealth. Money is not wealth. Wealth is peace, love; and yes, private jets, caviar, and pothole-free roads. SO, WHAT IS MONEY? Money is a measurement of wealth. Just like centimeters, kilograms, and mph. Consider this thought experiment from Dan Robinson: - Throw $1000 into a volcano. - ➡️ Every remaining dollar measures fractionally more. - Throw a cow into a volcano. - ➡️ The world has lost a productive asset. MONEY IS NOT A REAL ASSET The value of wealth never changes: a cow is a cow; a house is a house. 
USD, Euros, gold, and Bitcoin are all the same: not assets, just different ways to measure wealth. Throwing $1000 into a volcano simply transfers their value to the remaining dollars. No wealth is lost. Dan’s purchasing power is just transferred to other dollar holders. THIS ALSO WORKS IN REVERSE! If the Federal Reserve prints $1,000,000,000,000, purchasing power is transferred away from other dollar holders… Printing money doesn’t increase real resources any more than printing theatre tickets increases the number of seats. OK… ONE QUESTION: “If money is like centimeters and kilograms, shouldn’t we all use the same measurement and make sure those dimensions don’t change?” The obvious answer is “Yes!” THE DIMENSIONS OF MONEY ARE ITS SUPPLY Say a house measures $100,000. Double the supply of dollars (i.e. halve the measurement size) and the house now measures $200,000. HOW MUCH ARE THE DIMENSIONS CHANGING? This much: US M2 Money Supply consisting of cash, checking accounts, savings accounts, and money-market funds. Since 1990, the dollars we use to measure wealth have increased 560%. HOW?! Commercial banks, moderated by the Federal Reserve, create new money whenever they issue loans. WHAT DOES THIS MEAN? Most people don’t know how to measure wealth accurately. Those who do know can take advantage of those who don’t: Step 1: Borrow money. Step 2: Swap for real wealth (houses, etc) before the owners realize the new measurement. IS THERE A NAME FOR THIS PROCESS? Take your pick: - Theft. - Fraud. - Abomination. - Monetary Policy… - Arbitrage… WHAT ABOUT THE POTHOLES? Borrowing money to flip assets is just too darn easy. Why study potholes when finance is rigged to win? Interest rate and currency arbitrage are so lucrative that they have bewitched the brightest minds and drawn enormous amounts of capital away from the difficult task of making the world a better place; i.e. creating wealth. A PATH FORWARD: So, we've diagnosed the illness. But what's the cure? 1. 
ASK THE RIGHT QUESTION “What causes poverty? Nothing. It’s the original state, the default and starting point. The real question is, What causes #prosperity?” - Per Bylund 2. FINANCIAL LITERACY: Ignorance is the enemy. We can't change the game if we don't know the rules. Understand how money, finance, and economics work. Then share what you’ve learned! "Knowledge itself is power." - Francis Bacon 3. INVEST IN REAL WEALTH: Invest in health, education, sustainable technology, infrastructure, arts. These are real sources of wealth. They make our lives better and contribute to the long-term wellbeing of our planet. 4. INNOVATE: What if we rethink money? New forms of money like Bitcoin, Ethereum and Radix are already challenging the status quo. They're not perfect yet, but they're a start. Let's experiment, innovate, and find better ways to measure and exchange wealth. 5. FAIR DISTRIBUTION: Wealth isn't just about accumulation. It's about distribution. A society where wealth is concentrated in the hands of a few is a society on the brink. Let's strive for a broad distribution of wealth. A FINAL THOUGHT: Money isn't evil, but debasing it is. Money is a tool. A tool we've created. And if we've created it, we can make it better. RADIX.wiki (http://RADIX.wiki) is a knowledge and community hub for the Radix ecosystem. Follow us on Twitter (https://twitter.com/RadixWiki) for updates. Contents (/contents) | Ecosystem (/ecosystem) | News | Jobs | Talent Pool ## Radix Is Florence URL: https://radix.wiki/blog/radix-is-florence Updated: 2026-02-08 Radix is Florence 7 min read Web1, 2 and 3 have all been compared to the Industrial Revolution because each has introduced new systems and organizations with the potential to supercharge global productivity. However, Web3 is distinct and its characteristics place it closer to the earlier Commercial Revolution of the 10th to 14th centuries than the more famous 18th century era. 
Maps Before dApps The first Industrial Revolution brought us the steam engine, mass production, and an exponential increase in mechanical precision. The late 19th century Belle Époque saw an even more elaborate wave of innovations, including central heating, the internal combustion engine, aeroplanes, plastics, and heart surgery. Having waited 5000 years since the farming revolution, the world saw a brand new consumer class emerge within 500, and agriculture receded as a smaller and smaller component of the global economy. Yet, as the Reformation built upon the infrastructure and freedoms won during the Renaissance, so the Industrial Revolution wouldn’t have been possible without the groundwork that was laid by the Commercial Revolution. That time wasn’t characterized so much by invention as by increasing access to existing goods and services. We see exactly the same situation in Web3 today: a loud chorus of commentators is noting the lack of novel utility and consumer applications, but this is like demanding aeroplanes before the invention of the lathe. Broad consumer adoption won’t happen before transaction costs drop below marginal utility, and that still requires Web3 infrastructure to mature and cover the current economy’s uncharted hinterlands. As this process unfolds, we are seeing a steady churn of applications migrating from network to network in search of the most efficient delivery system for their goods and services. The Florence Protocol The Commercial Revolution began in the 10th Century when European traders carved out trade routes to the Levant, Spain, and North Africa. European agricultural societies had developed a taste for Byzantine luxuries and Eastern spices, inspiring merchants to overcome geographical, cultural, and administrative barriers to supply their customers.
Principal city-states like Siena, Pisa, and Venice became the epicenters of this commercial boom by leveraging their existing liquidity, political stability, and juridical systems to facilitate trade and reduce transaction costs through contract law, property rights, monetary exchange systems, and the advent of shipping, insurance, and marketing mechanisms. Among all of the Tuscan city states, Florence was arguably the most successful. Between 1071 and 1284 the city enlarged its walls three times and grew its population ~25x. Even after the devastation of the Black Death in 1348, the city went on to conquer Pisa and acquire direct access to the Mediterranean via the ports of Porto Pisano and Livorno. Despite - or perhaps because of - its inland location and pastoral heritage, Florence’s advantage lay in its people’s willingness to get their hands dirty. In the years preceding, its merchants had established a network of companies abroad engaged in commerce, banking, and government finance that together operated as a unified trade network. In contrast, despite having vast iron deposits, rich soil, and chief stewardship of the papal accounts, Siena’s merchant bankers didn’t share the Florentines’ aggressive entrepreneurship. As Florence grew, Siena couldn’t compete: its main bank collapsed in 1298 and its bankers retired as rentier landowners. Looking across the landscape of Web3, the furnaces of infrastructure are still red hot. The Ethereum (/contents/tech/comparisons/radix-vs-ethereum) ecosystem alone is incubating ~50 rollups as satellites around its core technology. The price premium given to Layer 1 (L1) networks suggests that some might venture out on their own to join a growing list of existing competitors, including Radix, Solana, Cardano, Avalanche, and Aptos. Each network is like a city state, racing to plug the rest of the world into its standards and trade routes.
Arguably, Bitcoin has been the most successful at colonizing the ruins of failing currencies, but its poor utility has largely failed to quicken the hearts of many expansionist merchants. Most conversations in Web3 center around the monetary qualities of native assets such as $ETH and $BTC. Yet Florence only introduced a standard unit of account - the Florin - in 1252, almost at the apex of its expansion, after most of the infrastructure work had been done. By 1300, the Florin had become the global standard, gilding in 50 years the trade routes that had taken at least three centuries to build. After 1400, various governments started minting their own competitors to the Florin, eventually driving it out of circulation, but that was only possible because the Florence protocol had already won: the trade routes, commercial treaties, banking network, and legal system were in place as a rich substrate for merchants to grow their businesses. The Radix Protocol Like Florence, Radix is also a late developer, having clocked up over ten years of research (/contents/history/history-of-radix) to arrive at its current form. During that time, the denizens of Web3 have looked beyond mere store of value toward financial luxuries that were previously only available offshore or to accredited investors. Ethereum and other smart contract platforms have brought exotic assets, decentralized organizations, and financial tools to our desktops and smartphones, but none in a sufficiently frictionless way to convince the wider economy that their trade network is a viable one to follow. The obstacles faced by these early pioneers center around the architecture of the protocols themselves, beginning with the all-important developer experience. Developers are the early pioneers of Web3 protocols, seeking a protocol to host their inventions as quickly and safely as possible. 
The complexity of early protocols like Ethereum led to over $3bn worth of hacks (https://rekt.news/leaderboard/) in 2022, forcing developers to spend ~90% of their time on safety considerations rather than shipping products. Such a drain on resources makes Ethereum and protocols that borrow its technology uncompetitive. One stumbling block of the Ethereum protocol is that it keeps token balances on vast tally charts called token contracts. Every transaction involves updating the sender’s and receiver’s entries in the contract, and wallets must query every token contract to display the correct balance of every asset. In contrast, Radix is built around the intuitive concept of native assets (https://learn.radixdlt.com/article/what-are-native-assets) that are held in smart accounts (/contents/tech/core-protocols/smart-accounts) , making applications much easier to design and safer to manage for aspiring builders. As well as a more rational architecture, Radix’s smart contract language, Scrypto (/contents/tech/core-protocols/scrypto-programming-language) , has been built expressly for the needs of decentralized finance (DeFi). Scrypto is simple enough for complex projects such as Trove (/ecosystem/trove) to be brought to market by a single developer (https://www.radixdlt.com/blog/runs-on-radix-q-a-trove) and for Shardspace (/ecosystem/shardspace) to roll out an advanced management suite a mere five weeks after the Babylon (/contents/tech/releases/radix-mainnet-babylon) launch. On the UX side, the Radix Wallet (/contents/tech/core-protocols/radix-wallet) allows for simple encrypted connections to Radix applications, without the need for third-party connectors. Within the wallet, users can create and customize Personas (/contents/tech/core-protocols/personas) to share as much information as they want with applications or withdraw it at the touch of a button.
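The ledger-architecture contrast described above - per-token tally charts versus assets held directly in accounts - can be sketched as a toy model. This is illustrative Python only: the account names and dictionary structures are invented for the example and are not Ethereum's or Radix's actual data structures.

```python
# Toy contrast between two ways of recording token ownership.

# ERC-20 style: each token contract is a tally chart (address -> balance).
token_contracts = {
    "USDC": {"alice": 100, "bob": 50},
    "DAI":  {"alice": 25},
}

def wallet_view_contract_model(address: str) -> dict[str, int]:
    """A wallet must query every token contract to assemble one view."""
    return {
        token: balances[address]
        for token, balances in token_contracts.items()
        if address in balances
    }

# Native-asset style: each account directly holds its assets.
accounts = {
    "alice": {"USDC": 100, "DAI": 25},
    "bob":   {"USDC": 50},
}

def wallet_view_native_model(address: str) -> dict[str, int]:
    """One lookup: the account itself knows what it holds."""
    return accounts[address]

print(wallet_view_contract_model("alice"))  # {'USDC': 100, 'DAI': 25}
print(wallet_view_native_model("alice"))    # {'USDC': 100, 'DAI': 25}
```

Both views agree, but in the contract model the wallet's work grows with the number of token contracts on the network, while in the native model a single account lookup suffices.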
Finally, the Transaction Manifest (https://learn.radixdlt.com/article/what-are-transaction-manifests) gives a clear, plain-English summary of every transaction, complete with a mechanism to void transactions that fall below an expected inbound amount, for example, in an asset swap. Radix’s decade of development was arduous, but it was necessary to build a technical foundation that can truly host global commerce. Now we are beginning to see merchants colonize it with applications and users. The Radix Renaissance 700 years on, the blood, sweat, and tears of the Commercial Revolution are now largely forgotten because they were the birth pangs of a period of human flourishing so profound that its surviving artifacts are still among the most valuable on Earth: the Renaissance. This is the bull case for Radix: eventually, the rhetoric of architecture and infrastructure will defer to a babble of societal and cultural flourishing. The Radix Renaissance is more than just a metaphor; it is a movement towards an era of unprecedented digital prosperity; a future where digital commerce flows with the relative ease and security of the legendary Tuscan city-state, transforming rhetoric into reality and architecture into civilization. ## POW vs POS: The Next Industrial Revolution URL: https://radix.wiki/blog/pow-vs-pos-the-next-industrial-revolution Updated: 2026-02-08 PoW vs PoS: The Next Industrial Revolution 7 min read The first Industrial Revolution saw society transformed by the power of coal. This former king of resources fueled the engines of progress, but as our understanding evolved, we saw coal’s darker side: its dirtiness, inefficiency, and environmental toll.
A similar narrative has unfolded in crypto: Proof of Work (/contents/resources/python-scripts/proof-of-work) (PoW) - the mechanism behind Bitcoin - is the coal of the digital age - its role at the inception of crypto is undeniable, but so are its limitations… The most prominent alternative to PoW so far is Proof of Stake (PoS) (/contents/tech/core-concepts/delegated-proof-of-stake-dpos) – if not renewable, then the natural gas to PoW's coal. As PoS gains momentum and market share, we have seen variations of it such as Delegated Proof of Stake (DPoS) (/contents/tech/core-concepts/delegated-proof-of-stake-dpos) , which is employed by Radix and other networks. This evolution, akin to refining our energy sources, promises enhanced scalability and inclusivity. As we delve deeper into the dynamics of PoW, PoS, and DPoS, we'll see how these shifts move the decentralized web toward a more sustainable, efficient, and democratized digital future. Sybil Prevention PoW and PoS differ in their methods but both have the same aim: to impose a prohibitive cost on gaining majority control of a network like Bitcoin, Ethereum or Radix - a situation broadly known as a Sybil attack. In exchange, both methods reward participants for their efforts, usually in the network’s native asset. In this context, ‘control’ refers primarily to the consensus mechanism - the method by which the network agrees which transactions are valid. In PoW this is achieved through mining, and in PoS by running a validator. As transactions are created by users, approving them amounts to a vote, meaning that if an attacker were to gain over 50% of the voting power, they could fabricate and approve any number of beneficial transactions. A sustained attack of this kind would eventually destroy the integrity of a network and the value of any assets it hosts - defeating the object - but exiting an attack before that point could net a significant payout for anyone with the means to do so. 
Apart from consensus, the other way to participate in a network is by hosting a node that audits the work done by miners or validators. Node operators might seem ancillary, but collectively they wield enormous power by deciding which version of the consensus software is canonical. Miners and validators also run nodes, but if a majority were to decide to run software that rejects legacy transactions, it would be unviable for miners and validators to depart from the economic majority of the network by doing otherwise. There are, then, two distinct types of Sybil attack in crypto networks: one, dominating the consensus mechanism within the existing rules; the other, a critical mass of nodes issuing new rules, either by social consensus or a proliferation of malicious nodes. PoW - Computational Friction 🔢 PoW introduces computational friction by requiring miners to notarize each block of transactions with a unique cryptographic stamp called a ‘block hash’. Hashes are fixed-length alphanumeric strings, but the twist in PoW is that it requires the block hash to begin with a number of leading zeros, with each additional zero making a valid hash exponentially rarer and more computationally expensive to find. Because the transaction data itself is static, miners must append extra data - a counter known as a ‘nonce’ - hashing each value together with the transaction data until the resulting hash has the required number of leading zeros. 💡 The simplest way to understand PoW is to try it yourself! Here (/contents/resources/python-scripts/proof-of-work) is a simple PoW script you can run at home. 💡 Hashing a block 10^16 times (roughly the current requirement for Bitcoin) requires millions of dollars’ worth of hardware and electricity. However, once found, hashes are trivial to validate, meaning that if a miner were to admit a fraudulent transaction, their block would be rejected and their resources would have been wasted.
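The mining loop described above can be tried in a few lines of Python. This is a toy sketch, not the wiki's linked script: real Bitcoin mining double-hashes a binary block header and measures difficulty in bits rather than hex digits, and the transaction string here is invented for the example.

```python
# Minimal proof-of-work: find a nonce so that SHA-256(data + nonce)
# starts with `difficulty` leading zeros (in hex).
import hashlib

def mine(block_data: str, difficulty: int) -> tuple[int, str]:
    """Increment a nonce until the hash meets the difficulty target."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

nonce, digest = mine("alice pays bob 1 XRD", difficulty=4)
print(nonce, digest)

# Finding the nonce took tens of thousands of hashes on average;
# verifying it takes exactly one:
check = hashlib.sha256(f"alice pays bob 1 XRD{nonce}".encode()).hexdigest()
assert check == digest and check.startswith("0000")
```

Each extra zero of difficulty multiplies the expected number of attempts by 16, which is why verification stays trivial while mining gets exponentially harder.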
The positive case is ‘consensus’, specifically ‘Nakamoto Consensus’, which is when the network of miners trusts a block enough to build another one on top of it. The more cumulative PoW there is on a chain, the more trustworthy it is and the more likely it will continue to be built on by miners. PoS - Economic Friction 💸 PoS is conceptually similar to PoW, but instead of expending computational resources, validators are required to stake the network’s native tokens, which are forfeited if one of their blocks is found to be invalid or if the validator otherwise behaves dishonestly or incompetently. PoW vs PoS 🥊 Comparisons between PoW and PoS center around three properties: economic security, efficiency, and decentralization. Economic Security & Efficiency PoW’s simple implementation and use of real resources make it highly secure. This robustness, however, comes at a cost: PoW systems have been strongly criticized for their inefficiency and substantial energy consumption. While critics of the latter rarely offer fair comparisons with networks such as the global banking system, PoW’s inefficiency is less easy to defend: its competitive and serial nature means that every participant must race to hash every block, despite there only ever being one winner among thousands of miners. As an illustration, Ethereum researcher Justin Drake (https://twitter.com/drakefjustin) estimated in March 2022 that switching to PoS made Ethereum 3.5x more economically secure than Bitcoin and 33x more efficient. In addition to being more economically efficient at securing value than PoW, PoS is also more capital efficient because it doesn’t require the consumption of real-world assets. Dan Robinson (https://twitter.com/danrobinson) has argued that even though both PoS and PoW may be costly on an individual basis, only PoW is costly to society in its use of real-world resources.
For a deeper dive on this topic, check out our article on Money, Wealth and Volcanos (/blog/money-wealth-volcanos) . Decentralization Decentralization is a multifaceted issue, encompassing geographical, socio-economic and technical considerations. Geographical Considerations On the geographical aspect, PoW mining farms are conspicuous because of the amount of electricity their hardware consumes. They also tend to locate in areas where electricity is cheapest. Both of these factors make PoW systems vulnerable to national sanctions of the type seen during 2021 (https://www.cnbc.com/2021/06/15/chinas-bitcoin-miner-exodus-.html) in China. In contrast, PoS systems require minimal external resources and lower up-front capital costs, which means that validator software can be run more discreetly on consumer hardware by anyone with an internet connection, facilitating geographical - and political - decentralization. Socio-economic Considerations Although mining rigs and validators both require a level of expertise to operate, the simpler implementation of PoS caters to a broader gamut of users than the specialized hardware needed for PoW. On the other hand, PoS has been accused of precipitating wealth concentration, since those with more tokens have a higher chance of being selected to validate transactions and earn rewards, allowing them to compound their holdings over time. PoW mining is designed such that the cost of production remains roughly equal to the value of new coins (a rising price attracts more mining resources, which increases the mining difficulty, and thus the cost of production). Staking doesn’t have such a mechanism, but its lower barrier to entry arguably has the same effect by enabling new stakers to arbitrage high yields, effectively pegging them close to the cost of production.
Technical Considerations Most other comparisons between PoW and PoS tend to overlook the presence of mitigation strategies or the parallels between mining equipment in PoW and staked assets in PoS. For example, in a PoS system, those who hold a larger stake in the network's tokens have a higher probability of being chosen to add a block to the blockchain. This is analogous to PoW systems, where participants with more powerful mining hardware stand a greater chance of successfully mining a block and adding it to the blockchain. Another distinct challenge in PoS systems is the ‘nothing at stake’ problem. This occurs in systems where consensus in the event of a disagreement is defined as building on the longest chain (Nakamoto Consensus) - the option with the most support from validators. Unlike in PoW, where hashing power must be split between competing chains, it is trivial for validators to hedge their bets by voting on several versions of the chain at once, since doing so costs them nothing. DPoS on Radix Radix employs a variation of PoS known as Delegated Proof of Stake (DPoS). In this system, instead of staking directly, native token holders can delegate their stakes to validators, who run the validating software on their behalf and compete on the basis of their fees and reliability. This arrangement allows for a more diverse participant pool than traditional PoS systems by enabling users who don’t have the technical expertise or sufficient holdings required to become a validator to still participate in network security and consensus. DPoS, though, is not without its own challenges. One of the most significant is the risk of centralization. If a small number of validators end up controlling a majority of the delegated stake, the network could become vulnerable to a coup. To guard against this, DPoS also implements the slashing conditions mentioned above, although these have not yet been introduced on Radix.
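The stake-weighted selection described above, including DPoS-style delegation, can be sketched as follows. This is a toy model with invented validator names and stakes; it is not Radix's (or any network's) actual selection algorithm.

```python
# Toy stake-weighted proposer selection: the chance of proposing the
# next block is proportional to a validator's total stake, including
# stake delegated to it by token holders.
import random

def pick_proposer(stakes: dict[str, float], rng: random.Random) -> str:
    """Pick one validator with probability proportional to its stake."""
    names = list(stakes)
    weights = [stakes[n] for n in names]
    return rng.choices(names, weights=weights, k=1)[0]

# DPoS-style: holders delegate to validators rather than staking solo.
delegations = {
    "validator_a": [60.0, 10.0],  # two delegators
    "validator_b": [20.0],
    "validator_c": [10.0],
}
stakes = {v: sum(ds) for v, ds in delegations.items()}

rng = random.Random(0)
picks = [pick_proposer(stakes, rng) for _ in range(10_000)]
print(picks.count("validator_a") / len(picks))  # ≈ 0.70
```

With 70% of the total stake, validator_a proposes roughly 70% of blocks, which is the compounding dynamic the socio-economic critique above refers to.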
Validators are responsible for securing the delegated stakes and ensuring they act in the best interest of the network. Any misbehavior on their part could lead to significant losses for their delegators. As such, delegators must be careful in choosing trustworthy validators. Despite these challenges, DPoS systems like Radix provide an efficient and inclusive alternative to traditional PoW and PoS mechanisms. It is also worth noting that Radix’s consensus protocol - Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) - does not reference the longest chain, so it avoids the nothing-at-stake vulnerability of Nakamoto Consensus. Conclusion While PoW has served as the foundation for cryptocurrencies, its substantial energy consumption, economic inefficiency, and geographic centralization are anachronisms that will inevitably be jettisoned by a new generation that demands higher standards. As the digital age evolves, the transition to PoS and DPoS marks a shift towards more sustainable and efficient ledgers like Radix. Mirroring the journey from coal to cleaner energy sources, this evolution underscores the industry's commitment to addressing the limitations of its predecessors. With PoS and DPoS, the blockchain world is moving towards a more inclusive, scalable, and environmentally friendly future, promising a decentralized web that not only reduces its ecological footprint but also democratizes participation. As we navigate this transition, the promise of a more efficient and equitable digital ecosystem heralds a new era of technological progress grounded in sustainability and inclusion.
## RGH2024 Debrief URL: https://radix.wiki/blog/rgh2024-debrief Updated: 2026-02-08 RGH2024 Debrief 5 min read Over the April 20th weekend, while most of crypto was literally treading water in Dubai, we brought ~60 people together for a next-gen hackathon on the Radix stack. Here’s the background, what was built, and what we learned. Protocol Revolution → Publishing Revolution Before we get to the event, here’s a bit of context. Throughout history, protocol revolutions have been reliably followed by publishing revolutions: from the 20m books that rolled off Gutenberg’s 1440 printing press to the ~50bn web pages enabled by TCP/IP, HTTPS, and APIs. Radix is likewise a protocol revolution: Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) , the Radix Engine (/contents/tech/core-protocols/radix-engine) , Scrypto (/contents/tech/core-protocols/scrypto-programming-language) , and the numerous other protocols (/contents/tech/core-protocols) that constitute Radix are leading to a publishing revolution where anyone around the world is able to build and launch a performant Web3 application. At this stage of the Web3 publishing revolution, developers are the main customers of blockchains and distributed ledgers. And like any customer thinking about a long-term commitment of capital or resources, they want to kick the tires before they start the long process of building an application. Usually held over a weekend, hackathons are an ideal vehicle to expedite this process. They give developers a chance to build a Minimum Viable Product (MVP) with help from mentors, while being fed and watered in the company of like-minded collaborators with whom they can explore the potential of a technology. During such events, developers are able to ask vital questions such as: “Is this technology viable?”
“Is it future-proof?” “Is there adequate documentation?” “How quickly will I be able to take a product to market?” Radix is a Web3 infrastructure layer that has been built with these questions in mind. For example: - Assets on Radix can only exist in a small number of permitted states, freeing developers from the stressful, time-consuming task of trying to specifically exclude undesirable states. - Like a physics engine in the gaming world, Radix’s execution layer - the Radix Engine - abstracts away the logic and behavior of assets, leaving developers free to focus on application and business logic. In contrast, developers on the Turing-complete Ethereum blockchain spend ~90% of their time (https://x.com/PiersRidyard/status/1360651370676838403) on security. RADIX.wiki (http://RADIX.wiki) 🤝 ArtSect With a mission to accelerate the Web3 publishing revolution and building on the success of our Babylon Launch Party (/blog/a-year-in-review-2023) , we organized a Radix Global Hackathon in collaboration with the East London Decentralized Art Organization, ArtSect (https://www.artsect.xyz/) . The primary objective was to introduce developers to the Radix stack but also to strategically raise the visibility of Radix within one of the most vibrant and prolific Web3 art communities in the world. The event began on Saturday April 20th at 10am and continued throughout the night until around 6pm on the Sunday. Over the two days we had around 60 people through the door, ranging from experienced full-stack developers to absolute beginners. Experience 🤝 Learnings Aside from the logistical challenges of creating a harmonious happening of space, wifi, and food, here are a few things we learned during the event: - The Scrypto effect is real. Nearly all of the developers commented on how quickly they were able to build and a few - having never touched Scrypto before - were up to speed within a couple of hours. 
Richard Jarram (https://www.linkedin.com/in/richardjarram/) , GeoChain - A thorough presentation and a slice of pizza are enough to red-pill new developers. Doing this at scale would have the highest ROI of any marketing initiative we can think of. - The prize money is way less important than we realized. Most participants signed up before the prizes were announced and hardly any of them mentioned it during the event. Building something cool was their primary objective. - University blockchain societies are hosting events every week and always on the lookout for interesting developments to showcase. - Beyond Bitcoin, the valence of networks and loyalty of participants is much weaker. Several participants wearing Solana t-shirts simultaneously said how difficult it is to develop on! - Huge industries are still actively looking for where to build in Web3. We spoke to founders of gaming, fashion, and art businesses all waiting for the right infrastructure to deploy on. In the end we had 7 great submissions. Here are the results and presentation videos: 🏆 Results 🥇 Beginners’ Track Winner - 🗺️ Geochain geochain (https://github.com/metalogica/geochain) Geochain securely tracks and manages shipments in real-time using Radix for a transparent and efficient supply chain. Team: Richard Jarram (https://www.linkedin.com/in/richardjarram/) , Mariana Oka (https://www.linkedin.com/in/mariana-oka/) Presentation: https://docs.google.com/presentation/d/1NHJWIptTSpNwz4_rQeq7qeKeIL7GeEcEXz9H9wb7EwI/edit#slide=id.g26f1ee450e0_0_322 (https://docs.google.com/presentation/d/1NHJWIptTSpNwz4_rQeq7qeKeIL7GeEcEXz9H9wb7EwI/edit#slide=id.g26f1ee450e0_0_322) 🥈 Beginners’ Track Silver - 📊 Radix Stats radixstats (https://github.com/dcts/radixstats) Radix Stats provides free and open token analytics for Radix, enhancing transparency and credibility. 
Team: Thomas Starzynski (https://www.linkedin.com/in/thomas-starzynski/) , Mauro Jose (https://www.linkedin.com/in/maurojose) RadixStats-slides.pdf4028.5KB (https://assets.super.so/2a0d12a5-9454-4417-b17e-d51617137fb6/files/35b8e60c-0557-4bcb-afe6-9c6b2a913525/RadixStats-slides.pdf) Radix Dashboard (https://radixstats.web.app/?resource=resource_rdx1t5xv44c0u99z096q00mv74emwmxwjw26m98lwlzq6ddlpe9f5cuc7s) Explore token stats on Radix (https://radixstats.web.app/?resource=resource_rdx1t5xv44c0u99z096q00mv74emwmxwjw26m98lwlzq6ddlpe9f5cuc7s) radixstats.web.app (http://radixstats.web.app) 🥉 Beginners’ Track Bronze - 🕺 Jouna-Wyliometer jouna-wyliometer (https://github.com/jameswylie/jouna-wyliometer) The Jouna-Wyliometer is a price sentiment tool that uses a 3D avatar who, via the Astrolescent API, reacts according to the current $XRD price. Team: Jouna Lansman (https://twitter.com/JounaLansman) Wyliometer - Price Sentiment Tool (https://jouna-wyliometer.pages.dev/) 🥇 Advanced Track Winner - ∞ InfiniX RadixScryptoChallenge---InfiniX (https://github.com/AbdurRazzak01/RadixHack) AbdurRazzak01 ⋅ 2 years ago (https://github.com/AbdurRazzak01/RadixHack) InfiniX offers customizable loss limits, instant fund settlements, and parametric insurance integration for DeFi investments. Team: MD Abdur Razzak (https://www.linkedin.com/in/-abdur-razzak?miniProfileUrn=urn%3Ali%3Afs_miniProfile%3AACoAADmcRukBZBWcEc42TGiCbxkH1K0Tn9QAK0k&lipi=urn%3Ali%3Apage%3Ad_flagship3_search_srp_all%3BHakxM5kLQl6Ta%2FV2gp3MvQ%3D%3D) , Kishan Marsonia (https://www.linkedin.com/in/kishan-marsonia) InfiniX (https://www.canva.com/design/DAGDCJiRn0g/jvYaaVqe4WQasc20wUjRYw/edit?utm_content=DAGDCJiRn0g&utm_campaign=designshare&utm_medium=link2&utm_source=sharebutton) 🥈 Advanced Track Silver - 🤖 AI Invest aiin (https://github.com/muhaj/aiin) AI Invest provides an accessible chatbot interface with real-time stock and crypto data, guided purchases, and engaging UI for simplified trading. 
Team: Jetnor Muhaj (https://www.linkedin.com/in/jetnormuhaj/) AI Invest Radix Chat bot (https://www.canva.com/design/DAGCX1Ol78M/MXBVmGt_t_ucxpUkPbISPw/view?utm_content=DAGCX1Ol78M&utm_campaign=designshare&utm_medium=link&utm_source=editor) Check out this Presentation designed by Jetnor Muhaj. (https://www.canva.com/design/DAGCX1Ol78M/MXBVmGt_t_ucxpUkPbISPw/view?utm_content=DAGCX1Ol78M&utm_campaign=designshare&utm_medium=link&utm_source=editor) 🥉 Advanced Track Bronze - 🥊 Top Scorer / FitClash fitclashdemo (https://github.com/fitclash/fitclashdemo) 2 years ago (https://github.com/fitclash/fitclashdemo) Top Scorer / FitClash is a fitness app that allows friends to wager each other with $XRD in various physical activities with real-time AI tracking and leaderboards. Team: Justin Grierson (https://www.linkedin.com/in/justin-grierson) Advanced Track Runner Up - 🎨 PiciX PiciX is a daily AI art challenge inspired by Wordle, where participants create image prompts to match a provided AI-generated image, with the winning creation minted as an NFT and sent to the winner's Radix wallet. Team: MD Abdur Razzak (https://www.linkedin.com/in/-abdur-razzak?miniProfileUrn=urn%3Ali%3Afs_miniProfile%3AACoAADmcRukBZBWcEc42TGiCbxkH1K0Tn9QAK0k&lipi=urn%3Ali%3Apage%3Ad_flagship3_search_srp_all%3BHakxM5kLQl6Ta%2FV2gp3MvQ%3D%3D) , Kishan Marsonia (https://www.linkedin.com/in/kishan-marsonia) In the end, this event validated our thesis that facilitating collaborations is one of the most valuable activities we can do for the Radix ecosystem, and the faster we can do it, the more it will compound. 
A huge thanks to everyone who participated and helped out, especially @a_vaunt (https://twitter.com/a_vaunt) , @wyliepieote (https://twitter.com/wyliepieote) , @beemdvp (https://twitter.com/beemdvp) , @Radstakes (https://twitter.com/Radstakes) and David Cuellar for the red-pilling and bug fixing, @ArtSectDAO (https://twitter.com/ArtSectDAO) for their 36hr straight technical support, and @JustEatBusiness (https://twitter.com/JustEatBusiness) , Dorothy's Deli and PAUL ( http://instagram.com/dorothysdeli (http://instagram.com/dorothysdeli) / http://instagram.com/paul_bakeryuk (http://instagram.com/paul_bakeryuk) ) for the food. Check out the event page here for more details, and see you all next year! RADIX.wiki (http://RADIX.wiki) is a knowledge and community hub for the Radix ecosystem. Follow us on Twitter (https://twitter.com/RadixWiki) for updates. Contents (/contents) | Ecosystem (/ecosystem) | Jobs | Talent Pool ## A Year in Review 2023 URL: https://radix.wiki/blog/a-year-in-review-2023 Updated: 2026-02-08 A Year in Review - 2023 2 min read Here are a few of our closing thoughts on an amazing year for RADIX.wiki (http://RADIX.wiki) and the Radix ecosystem in general. At a high level, our visitor numbers grew 30% per month and hailed from nearly 100 countries, suggesting that awareness of Radix is speed-running the paths already laid down by Bitcoin, Ethereum and other pioneers. Highlights from 2023 While a few of the things we’ve tried haven’t worked (Forum anyone?), a lot of initiatives have turned out better than we could have hoped: - Generated over 20k pageviews from 50+ articles covering Radix technology and related topics, as well as curating 100+ project and talent profiles. - Grew our Twitter / X following 10x with the help of our partners like Nelly Sayon (https://twitter.com/NellySayon) and London Crypto Club (https://twitter.com/LondonWeb3Club) as well as podcasters like Jspeak (https://twitter.com/JCRYPTO_YT) . 
On the business side, we listed 50+ open positions on our Jobs page to connect talent with opportunities. Linking job listings to projects in April also helped to increase the collision rate. In June, the team doubled (!) when James Smith joined. James grew RDX Works from 55 people to 110 in <12 months as well as driving their employer branding and recruitment marketing strategy. He is now in charge of business development as well as the recruitment side of the wiki. 🔥 As part of our commitment to the ecosystem, we joined the #RunsOnRadix program in July. We also hosted a successful Babylon mainnet launch party in London attended by ~100 people. Shardspace gave live demos showcasing the technology. Babylon launch party October was a breakthrough month for us, smashing our previous records in pageviews and maintaining our growth trajectory. We also acquired a significant portion of the $EMOON supply, adopting it as a governance token and thereby avoiding any market dilution. Additionally, we introduced a News page, offering rolling updates from the Radix ecosystem, and integrated CaviarNine (/ecosystem/caviarnine) ’s token swap widget onto our front page. November was marked by the inception of our NFT collection, starting with four from the 'God Eater Cerberus' collections on Impahla. God-Eater Cerberus #45 The Stage Is Set for an Exciting 2024 While some Layer 1 competitors gained traction in 2023, we believe Radix and its community will break through in 2024. With its superior tech and passionate builders, Radix has laid the foundation to become a leading smart contract platform in the space. As we enter 2024, the Radix ecosystem has incredible developer momentum. With core infrastructure in place after Babylon, we are seeing projects deliver real utility just three months after mainnet launch. We are more excited than ever about the future of Radix.
To continue building community and drive adoption, we plan to organize a Global Radix Hackathon in 2024 👀. Thank you for being part of our 2023 journey. Here's to an amazing year ahead! 🎉 ## Ten 10x Moments Coming to Web3 URL: https://radix.wiki/blog/ten-10x-moments-coming-to-web3 Updated: 2026-02-08 Ten ‘10x Moments’ Coming to Web3 8 min read From Developer Experience (DX) to User Experience (UX), Radix is supercharging the entire Web3 stack. Making every element 10x better sums up to a big number; a really, really big number… Web3 began with Bitcoin when the vapor of Austrian economics condensed into an engineering solution. Ethereum gave birth to decentralized finance (DeFi), and with every 10x improvement in speed, transaction costs, and security, even more use cases are annexed from TradFi into the decentralized future. However, as Web3 speed-runs its evolution, scalability problems have led to Layer-1 (L1) nation states becoming Layer-2 (L2) city states, with all of the economic and social friction that entails. One lesson from history is that trade is ruthlessly unsentimental - it migrates to wherever it is most welcome. In the 11th century, Florence was a pastoral backwater, but by 10x-ing its law, money and network it grew 25x in size and orders of magnitude more in wealth. Web3 is no different in principle and moves at 1000x the speed, as illustrated by blur.io (http://blur.io) unseating opensea.io (http://opensea.io) in three months by 10x-ing its fee structure, user experience (UX), and liquidity. In a few weeks’ time, [edit: Babylon (/contents/tech/releases/radix-mainnet-babylon) is now live!] Radix will storm the citadels of Web3 with a network, language and constitution that, if successful, is again braced to turn shepherds into kings. 
Every one of Radix’s innovations is a 10x moment in itself, but compounded together they become an irresistible magnet for commerce and a candidate for the dominant Web3 protocol. Exit Florence, enter Babylon… 1. Babylon Babylon. Midjourney. Let’s begin with a 100,000x moment. Radix’s Babylon (/contents/tech/releases/radix-mainnet-babylon) release will introduce several innovations that will combine to dramatically improve the utility and UX of Web3. - Native Assets (https://learn.radixdlt.com/article/what-are-native-assets) will 10x DX by allowing developers to work with assets intuitively at the platform level rather than via smart contracts. - Smart Accounts (/contents/tech/core-protocols/smart-accounts) will 10x DeFi security by enabling on-chain multi-factor authentication and gated accounts. - The Radix Wallet (/contents/tech/core-protocols/radix-wallet) will 10x UX with encrypted connections to DeFi apps. - Personas (/contents/tech/core-protocols/personas) will 10x privacy with customizable data sharing. - Transaction Manifests (/contents/tech/core-protocols/transaction-manifests) will 10x security and UX with clear summaries of every transaction. Native assets are especially transformative. Without them, Ethereum is stuck with a laborious peer-review process to agree standards on innovations like rental NFTs (https://eips.ethereum.org/EIPS/eip-4907) or soulbound tokens (https://eips.ethereum.org/EIPS/eip-5192) . These processes can take months. In contrast, adding access rules to tokens is trivial on Radix, requiring only single lines of code:
- Rental NFTs: `.recallable(rule!(require(some_badge_address)), LOCKED)`
- Soulbound tokens: `.restrict_withdraw(rule!(deny_all), LOCKED)`
The following video takes a deeper dive into Radix’s implementation of soulbound tokens: 2. The Radix Engine The Radix Engine. Midjourney. 
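The access-rule one-liners quoted in the Babylon section enforce behavior at the resource level rather than in per-contract code. As a rough illustration of those semantics - this is a hypothetical plain-Rust model, not the Scrypto or Radix Engine API; the `AccessRule` and `Resource` types are invented for the sketch - the idea can be expressed like this:

```rust
// Hypothetical model of Radix-style resource access rules (NOT the real API).
// A rule such as `.restrict_withdraw(rule!(deny_all), LOCKED)` is attached to
// the resource definition itself, so every holder is bound by it.

#[derive(Clone, Copy)]
enum AccessRule {
    AllowAll,
    DenyAll,
    RequireBadge(u64), // caller must present this badge id
}

struct Resource {
    withdraw_rule: AccessRule,
    recall_rule: AccessRule,
}

impl Resource {
    /// A soulbound token: withdrawals are denied for everyone, forever.
    fn soulbound() -> Self {
        Resource { withdraw_rule: AccessRule::DenyAll, recall_rule: AccessRule::DenyAll }
    }

    /// A "rental" style token: only the issuer's badge can recall it.
    fn rental(issuer_badge: u64) -> Self {
        Resource { withdraw_rule: AccessRule::AllowAll, recall_rule: AccessRule::RequireBadge(issuer_badge) }
    }

    fn check(rule: AccessRule, badge: Option<u64>) -> bool {
        match rule {
            AccessRule::AllowAll => true,
            AccessRule::DenyAll => false,
            AccessRule::RequireBadge(id) => badge == Some(id),
        }
    }

    fn can_withdraw(&self, badge: Option<u64>) -> bool {
        Self::check(self.withdraw_rule, badge)
    }

    fn can_recall(&self, badge: Option<u64>) -> bool {
        Self::check(self.recall_rule, badge)
    }
}

fn main() {
    let sbt = Resource::soulbound();
    assert!(!sbt.can_withdraw(None)); // holders cannot move a soulbound token

    let rent = Resource::rental(42);
    assert!(rent.can_recall(Some(42)));  // issuer's badge can recall it
    assert!(!rent.can_recall(Some(7)));  // nobody else can
    println!("access rules enforced");
}
```

The point of the model is that the rule travels with the resource definition: a soulbound token denies withdrawal to every holder, and a recallable token can only be pulled back by a caller presenting the issuer's badge - no per-dApp reimplementation required.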
Native assets on Radix live in an application and execution environment called the Radix Engine (/contents/tech/core-protocols/radix-engine) , so-called because it acts like a physics / game engine that enforces asset and transaction rules by default. Just as Unreal Engine (https://www.unrealengine.com/) has transformed the gaming and VFX industries, the Radix Engine is a 10x for DeFi developers, who are free to build complex applications without worrying about basic logic and safety considerations. 3. Scrypto Scrypto. Midjourney. Scrypto (/contents/tech/core-protocols/scrypto-programming-language) is a secure smart-contract language written in Rust and designed specifically for DeFi. Working with the Radix Engine, it reduces asset operations to single lines of code, making it possible to build complex decentralized applications (dApps) like Uniswap in 155 lines (https://github.com/radixdlt/scrypto-examples/tree/main/defi/radiswap) or others like chess (https://www.radixdlt.com/post/scrypto-is-as-smooth-as-vegan-butter) that are barely feasible (https://medium.com/@graycoding/lessons-learned-from-making-a-chess-game-for-ethereum-6917c01178b6) on Ethereum. Data suggests that there are 14x more Rust developers than Solidity developers worldwide, and Github hosts ~29x more repos in Rust than Solidity. By building on Rust’s existing ecosystem and developer community, Scrypto significantly improves the DX for existing developers and reduces the barriers to entry for those new to DeFi. 
|  | Rust | Solidity | Multiple |
| --- | --- | --- | --- |
| Developers | 2.8m (https://slashdata-website-cms.s3.amazonaws.com/sample_reports/dsIe6JlZge_KsHWt.pdf) | ~200k (https://blog.chain.link/smartcon-2022-developer-announcements/) (https://medium.com/electric-capital/electric-capital-developer-report-2021-f37874efea6d) | 14x |
| GitHub repositories (Dec 2023) | 87k (https://github.com/search?q=language%3ARust&type=Repositories&ref=advsearch&l=Rust&l=) | 3k (https://github.com/search?q=language%3ASolidity&type=Repositories&ref=advsearch&l=Solidity&l=) | 29x |
The following video takes a deeper dive into how Scrypto 10x’s DX: 4. Cerberus Cerberus. Midjourney. Radix’s founder Dan Hughes (/community/dan-hughes) spent seven years testing the limits of distributed ledger technology (/contents/history/history-of-radix) before combining a pre-sharded (/contents/tech/core-concepts/sharding) architecture with the Tempo (/contents/tech/research/tempo-consensus-mechanism) consensus mechanism to achieve 1.4m transactions per second (tps) (https://www.radixdlt.com/blog/replaying-bitcoin) in 2019. Following Tempo, Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) enables cross-shard parallel processing, meaning that unrelated transactions don’t have to wait in a global queue to be processed. Multi-threading opens the door to linear scaling where throughput is limited only by the number and power of validators. Cross-shard processing will be fully utilized by Radix’s Xi’an (/contents/tech/releases/radix-mainnet-xian) release in 2024. In the meantime, Web3 is waking up to Cerberus (https://www.tari.com/updates/2022-09-22-update-89) . 5. Atomic Composability Atomic Composability. Midjourney. Atomic composability (/contents/tech/core-concepts/atomic-composability) is the ability for multiple operations to be batched and executed as a single transaction. It is essential for DeFi (https://twitter.com/PiersRidyard/status/1583013758620008449) and one of Radix’s core concepts (/contents/tech) . 
While L2 solutions like Zero-Knowledge (ZK) rollups are prime candidates to 10-100x the throughput and security of Ethereum, their sequencing time and trust boundaries (/contents/tech/core-concepts/trust-boundary) impede the ability to atomically compose transactions. In contrast, Radix operates within a single trust boundary and on the release of Xi’an (/contents/tech/releases/radix-mainnet-xian) will atomically ‘braid’ substates across shards (/contents/tech/core-concepts/sharding) , easily exceeding the throughput of rollups, without their limitations. 6. Sharding / Linear Scaling Sharding. Midjourney. Now for a 2^256x moment. In its 2024 Xi’an (/contents/tech/releases/radix-mainnet-xian) release, Radix will upgrade its state model to 2^256 shards (/contents/tech/core-concepts/sharding) . Compared to the single-chain or dynamic sharding solutions of other L1s, pre-sharding delivers several significant benefits: - Keeps fees low by enabling the validator set to expand at times of high usage. - Enables transaction constituents (substates) to be mapped deterministically, reducing lookup times. - Guarantees that unrelated transactions will always use separate shards, enabling massive parallelization. - Guarantees that transactions are randomly distributed across shards (and hence that validators are randomly selected according to shard coverage). - Allows validators to dynamically adjust their shard coverage according to processing power. - Makes it viable for low-power IoT devices and mobile phones to participate in validation. The main benefit of sharding on Radix can be summarized as linear scalability: making transaction capacity a function of validator count. A second-order effect is that minuscule shards allow anyone to participate in validation, which is at least a 10x for decentralization, with positive implications for network security and wealth distribution. 7. Low Fees Low Fees. Midjourney. 
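The deterministic substate-to-shard mapping described in the sharding section above can be sketched with a simple hash. This is an illustrative model only - the real Xi'an design uses a 2^256 shard space and its own addressing scheme; the `shard_of` function and the substate id format here are assumptions made for the sketch:

```rust
// Illustrative sketch of deterministic substate-to-shard mapping in a
// pre-sharded address space (u64 here for brevity; Radix's Xi'an design
// targets 2^256 shards). Not the actual Radix implementation.

use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

/// Map a substate identifier onto a fixed shard space by hashing.
/// Because the mapping depends only on the id, any node can compute the
/// shard locally - no global lookup table or coordinator is required.
fn shard_of(substate_id: &str, shard_space: u64) -> u64 {
    let mut h = DefaultHasher::new();
    substate_id.hash(&mut h);
    h.finish() % shard_space
}

fn main() {
    let space = 1u64 << 32; // stand-in for the full 2^256 space

    // Deterministic: the same substate always lands on the same shard.
    assert_eq!(shard_of("vault:alice:xrd", space), shard_of("vault:alice:xrd", space));

    // Unrelated substates land on different shards with overwhelming
    // probability, so transactions touching them can run in parallel.
    let a = shard_of("vault:alice:xrd", space);
    let b = shard_of("vault:bob:xrd", space);
    println!("alice -> shard {a}, bob -> shard {b}");
}
```

Because the shard is a pure function of the substate id, validators can route transactions without consulting any global directory, which is what makes the massive parallelization described in the bullets above possible.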
It is hard to overstate the importance of low transaction fees to a healthy market economy. By penalizing transactions, high fees prevent accurate price discovery, leading to poor decisions and the misallocation of resources - a kind of economic dementia. Conversely, intelligence is the ability to error-correct - the lower transaction fees are, the more accurately prices can signal the true cost of production / demand, and the more ‘intelligent’ an economy is. Fees on Radix are currently ~500x lower than Ethereum’s and likely to decrease further after Xi’an (/contents/tech/releases/radix-mainnet-xian) when the current validator cap is lifted. 8. Delegated Proof of Stake Delegated Proof of Stake. Midjourney. Switching from Proof of Work (PoW) to Proof of Stake (PoS) made Ethereum 100x more energy efficient, 100x more economically efficient and 5x more secure. However, validators must stake a minimum of 32 $ETH (~$53k) to benefit from staking rewards. Third-party staking is available for less but comes with smart contract and slashing risks. Radix has a minimum stake of 90 $XRD (~$4) to deter spam but otherwise allows anyone to stake their funds to a validator to earn staking rewards. This system is known as Delegated Proof of Stake (dPoS) (/contents/tech/core-concepts/delegated-proof-of-stake-dpos) and is a 10x for network participation and distribution. 9. Burn the Bridges! Burn the Bridges. Midjourney. Prior to Radix, no protocol had the capacity to unite Web3 as TCP/IP (https://en.wikipedia.org/wiki/Internet_protocol_suite) united Web1. High demand on Ethereum block space led to high fees, which drove users and liquidity providers to deploy their capital to newer L1s, sidechains, rollups and other L2 scaling solutions. 
Because of the trust boundaries (/contents/tech/core-concepts/trust-boundary) of blockchains and other digital ledgers, bridging to these alternatives usually requires assets to be locked in a smart contract on the original chain and newly minted on the other. This has had several deleterious effects on the growth of Web3: - Bridging is poor UX, costing time and fees. - Spreading capital across separate ledgers fractures liquidity, leading to higher interest rates, higher spreads and lower velocity. - Bridges are vulnerable, accounting for ~50% of all DeFi exploits (https://twitter.com/tokenterminal/status/1582376876143968256) since September 2020. Being fast, linearly scalable and atomically composable, Radix eliminates the reasons for bridges entirely. 10. Simplicity Simplicity. Midjourney. By attempting to build a global financial system on a slow, single-threaded chain without native assets, Ethereum is having to resort to Byzantine solutions like Proto-Danksharding (https://www.eip4844.com/) to scale. Throw in competing ZK circuits and cross-chain bridges and eventually complexity itself becomes a drag on growth: every increase means fewer engineers, charging higher wages for sub-optimal solutions. If the complexity were a byproduct of new functionality then it might be justified, but the current stack is still grappling with the basic provisions of data availability and - by breaking atomic composability - actually reducing functionality rather than increasing it. History in the Making By the 16th century, despite issuing the world’s reserve currency, Florence grew insular and began to stagnate. Instead of adapting to competition from other states, the Grand Duke fixed the gold price. Unproductive Cain killed productive Abel and, true to form, commerce moved elsewhere. History shows that economic growth is not difficult; it just requires a commitment to reduce friction enough to entice capital away from its sunk costs and into a new ecosystem. 
Moreover, the virtuous relationship between liquidity and growth is fairly linear, shown by a $1m trade on the ~$5tn forex market being ~50x cheaper than one on the ~$100bn stock market. Marketing campaigns might help at the margins, but, fundamentally, commerce is a marriage of convenience, not looks. Characterized as a nation, Web3 is heir apparent: it has sound money, inviolable property rights, no defence expenditure, no immigration restrictions, and no national debt. All it lacks are lower operating costs - understood broadly as DX and UX - as well as transaction fees; in short, a ‘broadband moment’. On these measures, Radix has too many 10x’s to ignore, and is ready to make history. ## Bayesian URL: https://radix.wiki/community/bayezien Updated: 2026-02-08 Curator @ RADIX.wiki (http://RADIX.wiki) Builder @ Caper.network (http://Caper.network) ## Token Unlock URL: https://radix.wiki/contents/history/token-unlock Updated: 2026-02-08 Summary: The Radix Token Unlock was a transition from the price-vesting of $XRD to freely transferable tokens, enacted on September 15th, 2021. The Radix Token Unlock was a transition from the price-vesting of $XRD to freely transferable tokens, enacted on September 15th, 2021. Background In August 2021, Radix proposed changing to an immediate full unlock (https://www.radixdlt.com/post/update-on-the-token-unlock-community-survey) of all tokens that were subject to a price-vesting mechanism, and conducted community surveys (https://www.radixdlt.com/post/update-on-the-token-unlock-community-survey) to gauge sentiment on this proposal. 
Originally, a significant portion of the $XRD supply was subject to a price-based unlocking mechanism, whereby tokens would become available based on (e)XRD reaching certain price thresholds (https://www.radixdlt.com/post/community-survey-radix-token-unlock) over time. Community Surveys In the initial community survey, conducted from August 23-29, 2021, Radix asked participants via a single yes/no question whether all locked $XRD tokens should be unlocked immediately or remain on the existing unlock schedule. The survey required a verified Instapass account and email address to participate. Out of the responses, 66.8% voted in favor of the immediate unlock (https://www.radixdlt.com/post/results-unlock-survey-2) . Given the results of the first survey, Radix then announced a second survey weighted by the number of $XRD held in Radix wallet addresses linked to verified Instapass accounts. This second survey ran from September 1-5, 2021. Overall 72.34% of respondents (https://www.radixdlt.com/post/results-unlock-survey-2) , controlling 74.6% of the eligible $XRD supply, voted for the immediate unlock. Reasons for Unlocking There were several factors (https://www.radixdlt.com/post/update-on-the-token-unlock-community-survey) that motivated Radix to propose the immediate full unlock of all remaining locked $XRD tokens, despite the original staged price-based unlock schedule. First, within two weeks of the mainnet launch on July 28, 2021, over 48% of the circulating $XRD supply was already staked (https://www.radixdlt.com/post/token-unlock-september-15th) across more than 130 validator nodes, indicating a high degree of decentralization and security. In addition, development was progressing rapidly on the Alexandria network upgrade, which the team felt would benefit from the simplified tokenomics (https://www.radixdlt.com/post/update-on-the-token-unlock-community-survey) and further distribution enabled by the unlock. 
Finally, with the majority of survey respondents in favor, RDX Works agreed that fully unlocking the tokens was one of the most effective ways to accelerate overall adoption (https://www.radixdlt.com/post/token-unlock-september-15th) and decentralization of the network in line with their long-term goals. The Unlock Radix proceeded with the final unlock of all remaining locked $XRD tokens on September 15, 2021 (https://www.radixdlt.com/post/final-token-unlock-complete) , with the process completing between 14:00-17:00 UTC. At the time of the unlock, the tokenomics of $XRD were: - 9.6 billion $XRD circulating supply. - 12 billion $XRD total supply. - 24 billion $XRD maximum supply to be reached via an annual emission rate over at least 40 years - ~48% of the supply locked via staking - 15,000 $XRD in network fees had been burnt at the time of the final unlock (https://www.radixdlt.com/post/final-token-unlock-complete) . Impact The unlocking of all remaining $XRD tokens had an immediate impact on trading activity and network growth. In the first 24 hours after the unlock, over $65 million in daily trading volume of $XRD was recorded across exchanges (https://www.radixdlt.com/post/radix-report-16th-september) . The increased liquidity and simplified tokenomics contributed to continuing growth in awareness and staking participation on the network. By the next weekly update on September 16th, over 1.5 billion $XRD (https://www.radixdlt.com/post/radix-report-16th-september) - representing over 15% of the total supply - was staked across validator nodes. Radix also saw a surge in social media followers, up over 10% in the two weeks following the unlock (https://www.radixdlt.com/post/radix-report-16th-september) . This indicated an expanding community beyond just those holding unlocked tokens. 
## Scrypto Developer Event URL: https://radix.wiki/contents/history/scrypto-developer-event Updated: 2026-02-08 Summary: The Scrypto Developer Event was a workshop held on the 9th of April, 2022, in Lisbon, Portugal. Media The Scrypto Developer Event was a workshop held on the 9th of April, 2022, in Lisbon, Portugal. Media https://www.youtube.com/watch?v=bo5LQW5hATQ (https://www.youtube.com/watch?v=bo5LQW5hATQ) ## Scrypto DeFi Challenge URL: https://radix.wiki/contents/history/scrypto-defi-challenge Updated: 2026-02-08 Summary: The Scrypto DeFi Challenge was a remote hackathon organized by RDX Works from February 20 - March 24, 2023. Media Scrypto DeFi Challenge The Scrypto DeFi Challenge was a remote hackathon organized by RDX Works from February 20 - March 24, 2023. Media Scrypto DeFi Challenge (https://scryptodefi.devpost.com/) ## Radix Wiki Hackathon #1 URL: https://radix.wiki/contents/history/radix-wiki-hackathon-1 Updated: 2026-02-08 Summary: The Radix Wiki Hackathon 1 was held at the ArtSect Gallery in London on the 20-21st April, 2024. TELEGRAM → The Radix Wiki Hackathon #1 was held at the ArtSect Gallery (https://maps.app.goo.gl/webQjmqK4RfKtv1dA) in London on the 20-21st April, 2024. TELEGRAM → https://t.me/RadixHackathon (https://t.me/RadixHackathon) https://youtu.be/DVxf9KfDMKk?si=CbPohwbDoEgY8PFQ (https://youtu.be/DVxf9KfDMKk?si=CbPohwbDoEgY8PFQ) Testimonials “I woke up having never programmed in rust before or ever dealing with the Radix ecosystem but somehow managed to deploy my first NFT contract in test-net a few hours ago. Coding in solidity feels like working in a strait jacket compared to scrypto! Never going back.” - Richard Jarram (https://www.linkedin.com/in/richardjarram/) , Geochain “Can’t believe we learnt Scrypto and built Smart Contract for InfiniX just in 2 days! Thanks to all the mentors and judges. 
It was an amazing experience to be there and learn from all of you!” - MD Abdur Razzak (https://www.linkedin.com/in/-abdur-razzak?miniProfileUrn=urn%3Ali%3Afs_miniProfile%3AACoAADmcRukBZBWcEc42TGiCbxkH1K0Tn9QAK0k&lipi=urn%3Ali%3Apage%3Ad_flagship3_search_srp_all%3BHakxM5kLQl6Ta%2FV2gp3MvQ%3D%3D) , InfiniX “The experience was both challenging and fulfilling. The concentrated learning over the course of two days surpassed what I had learned over several months and the sense of community and the cooperative spirit were outstanding.” - Jetnor Muhaj (https://www.linkedin.com/in/jetnormuhaj/) , AI Invest. Tracks - Each selected winner had to use at least one of the following resources in their project: Scrypto (https://github.com/radixdlt/radixdlt-scrypto) , the Radix dApp Toolkit (https://docs.radixdlt.com/docs/en/dapp-toolkit) , the Radix Engine ToolKit (/developers/legacy-docs/integrate/radix-engine-toolkit/radix-engine-toolkit) (TS, python, swift, C#, Kotlin) the Network API (https://docs.radixdlt.com/docs/network-apis) (Core API, Gateway API, System API), the Radix Off Ledger Authentication (https://docs.radixdlt.com/docs/rola-radix-off-ledger-auth) (ROLA). 
🛤️ ADVANCED Track: Domains, Data, Finance
Track runner: XRD Domains / RadixCharts (https://radixcharts.com)
Prize pool ($XRD): 🥇 $3000 🥈 $1500 🥉 $500 - Can also be submitted to the Scrypto challenges (https://www.radixdlt.com/blog/scrypto-yield-derivatives-challenge-is-live) .
Resources:
● Radix Name Service SDK (https://github.com/radixnameservice/sdk)
● Telegram bots tutorial (https://core.telegram.org/bots/tutorial)
● Python telegram bot library (https://python-telegram-bot.org/)
● Ociswap Tokens API (https://api.ociswap.com/tokens)
● Ociswap Ignition Rewards API (https://api.ociswap.com/ignition)
● Scrypto Lending Challenge ~v0.4.0 (Current v1.1.1) (https://github.com/radixdlt/scrypto-challenges/tree/main/3-lending)
● Scrypto Exchanges Challenge ~v0.3.0 (Current v1.1.1) (https://github.com/radixdlt/scrypto-challenges/tree/main/1-exchanges)
● OrangeFinance docs (https://orange-finance.gitbook.io/orange-finance)
● Banana Gun docs (https://docs.bananagun.io/)
Ideas:
💡 Telegram clone that uses Domains for handshakes & account discovery.
💡 Sentiment analysis of domain transactions.
💡 Visualize top token holder accounts.
💡 Transaction root analysis tool: who is creating the most transactions.
💡 Telegram bot for stake and unstake actions.
💡 AI trading bot.
💡 Token rankings per USD transfer volumes.
💡 NFT trading statistics.
💡 An ‘Airnode’ plugin for first party oracles similar to API3 (https://www.api3.org) .
💡 Cross-shard DEX pool to eliminate trading hotspots.
💡 Automated liquidity management tool / OrangeFinance (https://app.orangefinance.io/arbitrum) .
💡 Banana Gun (https://bananagun.io) style auto-sniping.
💡 Dopex / Stryke (https://app.dopex.io/clamm/WETH-USDC) style options exchange.
💡 A lending protocol that uses future staking rewards as collateral.
🛤️ BEGINNERS Track: Front-end Toolkit / Tx Manifest / NFTs
Track runner: Shardspace
Prize pool ($XRD): 🥇 $3000 🥈 $1500 🥉 $500 - Can also be submitted to the Scrypto challenges (https://www.radixdlt.com/blog/scrypto-yield-derivatives-challenge-is-live) .
Resources:
● Transaction Manifest (https://docs.radixdlt.com/docs/transaction-manifest)
● Rust Manifest Builder (/developers/legacy-docs/integrate/rust-libraries/rust-manifest-builder)
● Gumball Machine Frontend dApp (https://docs.radixdlt.com/docs/en/learning-to-run-the-gumball-machine-front-end-dapp)
● Simple token transfer (/developers/legacy-docs/build/build-dapps/dapp-transactions/transaction-examples/simple-token-transfer)
● Token Gating (Coingecko) (https://www.coingecko.com/learn/token-gating)
● Token gating tools (other networks) (https://www.alchemy.com/best/token-gating-tools)
● Instruct Manifest Builder (https://instruct.radixbillboard.com)
● Radix Desktop Tool (https://github.com/atlantis-l/Radix-Desktop-Tool)
Ideas:
💡 Token launch sniping.
💡 Wallet following.
💡 Automated dollar-cost averaging.
💡 Conditional / time delayed transfers (e.g. after period of inactivity / birthday gifts).
💡 Publishing platform for Decentralized Science papers.
💡 NFT-gating for Discord groups.
💡 NFT collection gallery with rankings.
Results 🥇 Beginners’ Track Winner - 🗺️ Geochain https://github.com/metalogica/geochain (https://github.com/metalogica/geochain) Geochain securely tracks and manages shipments in real-time using Radix for a transparent and efficient supply chain. 
Team: Richard Jarram (https://www.linkedin.com/in/richardjarram/) , Mariana Oka (https://www.linkedin.com/in/mariana-oka/) Presentation: https://docs.google.com/presentation/d/1NHJWIptTSpNwz4_rQeq7qeKeIL7GeEcEXz9H9wb7EwI/edit#slide=id.g26f1ee450e0_0_322 (https://docs.google.com/presentation/d/1NHJWIptTSpNwz4_rQeq7qeKeIL7GeEcEXz9H9wb7EwI/edit#slide=id.g26f1ee450e0_0_322) https://youtu.be/mHk3oAZDGcs (https://youtu.be/mHk3oAZDGcs) 🥈 Beginners’ Track Silver - 📊 Radix Stats https://github.com/dcts/radixstats (https://github.com/dcts/radixstats) Radix Stats provides free and open token analytics for Radix, enhancing transparency and credibility. Team: Thomas Starzynski (https://www.linkedin.com/in/thomas-starzynski/) , Mauro Jose (https://www.linkedin.com/in/maurojose) RadixStats-slides.pdf (Radix%20Wiki%20Hackathon%20#1/RadixStats-slides.pdf) Radix Dashboard (https://radixstats.web.app/?resource=resource_rdx1t5xv44c0u99z096q00mv74emwmxwjw26m98lwlzq6ddlpe9f5cuc7s) https://youtu.be/1nRRscF6TPU (https://youtu.be/1nRRscF6TPU) 🥉 Beginners’ Track Bronze - 🕺 Jouna-Wyliometer https://github.com/jameswylie/jouna-wyliometer (https://github.com/jameswylie/jouna-wyliometer) The Jouna-Wyliometer is a price sentiment tool that uses a 3D avatar who, via the Astrolescent API, reacts according to the current $XRD price. Team: Jouna Lansman (https://twitter.com/JounaLansman) Wyliometer - Price Sentiment Tool (https://jouna-wyliometer.pages.dev/) https://youtu.be/vmYPUApLe8o (https://youtu.be/vmYPUApLe8o) 🥇 Advanced Track Winner - ∞ InfiniX https://github.com/AbdurRazzak01/RadixHack (https://github.com/AbdurRazzak01/RadixHack) InfiniX offers customizable loss limits, instant fund settlements, and parametric insurance integration for DeFi investments. 
Team: MD Abdur Razzak (https://www.linkedin.com/in/-abdur-razzak?miniProfileUrn=urn%3Ali%3Afs_miniProfile%3AACoAADmcRukBZBWcEc42TGiCbxkH1K0Tn9QAK0k&lipi=urn%3Ali%3Apage%3Ad_flagship3_search_srp_all%3BHakxM5kLQl6Ta%2FV2gp3MvQ%3D%3D) , Kishan Marsonia (https://www.linkedin.com/in/kishan-marsonia) InfiniX (https://www.canva.com/design/DAGDCJiRn0g/jvYaaVqe4WQasc20wUjRYw/edit?utm_content=DAGDCJiRn0g&utm_campaign=designshare&utm_medium=link2&utm_source=sharebutton) https://youtu.be/zNV2ufQnZTQ (https://youtu.be/zNV2ufQnZTQ) 🥈 Advanced Track Silver - 🤖 AI Invest https://github.com/muhaj/aiin (https://github.com/muhaj/aiin) AI Invest provides an accessible chatbot interface with real-time stock and crypto data, guided purchases, and engaging UI for simplified trading. Team: Jetnor Muhaj (https://www.linkedin.com/in/jetnormuhaj/) AI Invest Radix Chat bot (https://www.canva.com/design/DAGCX1Ol78M/MXBVmGt_t_ucxpUkPbISPw/view?utm_content=DAGCX1Ol78M&utm_campaign=designshare&utm_medium=link&utm_source=editor) https://youtu.be/idkv0eHkS5g (https://youtu.be/idkv0eHkS5g) 🥉 Advanced Track Bronze - 🥊 Top Scorer / FitClash https://github.com/fitclash/fitclashdemo (https://github.com/fitclash/fitclashdemo) Top Scorer / FitClash is a fitness app that allows friends to wager each other with $XRD in various physical activities with real-time AI tracking and leaderboards. Team: Justin Grierson (https://www.linkedin.com/in/justin-grierson) https://youtu.be/VebCu-UC3G4 (https://youtu.be/VebCu-UC3G4) Advanced Track Runner Up - 🎨 PiciX https://github.com/AbdurRazzak01/Picix (https://github.com/AbdurRazzak01/Picix) PiciX is a daily AI art challenge inspired by Wordle, where participants create image prompts to match a provided AI-generated image, with the winning creation minted as an NFT and sent to the winner's Radix wallet. 
Team: MD Abdur Razzak (https://www.linkedin.com/in/-abdur-razzak?miniProfileUrn=urn%3Ali%3Afs_miniProfile%3AACoAADmcRukBZBWcEc42TGiCbxkH1K0Tn9QAK0k&lipi=urn%3Ali%3Apage%3Ad_flagship3_search_srp_all%3BHakxM5kLQl6Ta%2FV2gp3MvQ%3D%3D) , Kishan Marsonia (https://www.linkedin.com/in/kishan-marsonia) PiciX (https://www.canva.com/design/DAGDD_xD-Gg/tGjN5rHKAb7rjzud4QyFSw/edit?utm_content=DAGDD_xD-Gg&utm_campaign=designshare&utm_medium=link2&utm_source=sharebutton) https://youtu.be/DHbISlKdOIc (https://youtu.be/DHbISlKdOIc) ## Radix Team Hackathon URL: https://radix.wiki/contents/history/radix-team-hackathon Updated: 2026-02-08 Summary: The Radix Team Hackathon was the first internal event of its kind organized by RDX Works. The event was held during the first week of April, 2019, in the Peak D The Radix Team Hackathon was the first internal event of its kind organized by RDX Works. The event was held during the first week of April, 2019, in the Peak District region of England. 
Media https://www.radixdlt.com/blog/the-first-radix-team-hackaton (https://www.radixdlt.com/blog/the-first-radix-team-hackaton) ## Pillar Unconference URL: https://radix.wiki/contents/history/pillar-unconference Updated: 2026-02-08 Summary: 2018-07-20 - Pillar Unconference, Vilnius, Lithuania 2018-07-20 - Pillar Unconference (https://medium.com/pillarproject/the-pillar-unconference-2018-f766375b4832) , Vilnius, Lithuania ## Pillar Hackathon #1 URL: https://radix.wiki/contents/history/pillar-hackathon-1 Updated: 2026-02-08 https://youtu.be/hLrbQAmWV0w (https://youtu.be/hLrbQAmWV0w) Media https://medium.com/pillarproject/after-the-pillar-hackathon-1-a0c74ef8c0f (https://medium.com/pillarproject/after-the-pillar-hackathon-1-a0c74ef8c0f) https://www.pillar.fi/blog/after-the-pillar-hackathon-1/ (https://www.pillar.fi/blog/after-the-pillar-hackathon-1/) ## FooHack URL: https://radix.wiki/contents/history/foohack Updated: 2026-02-08 Summary: FooHack was a hackathon organized by RDX Works in collaboration with the musician RedFoo on the 14th of June, 2022. Media FooHack was a hackathon organized by RDX Works in collaboration with the musician RedFoo (https://en.wikipedia.org/wiki/Redfoo) on the 14th of June, 2022. 
Media https://youtu.be/EtRAt3_CHks (https://youtu.be/EtRAt3_CHks) https://www.radixdlt.com/blog/scrypto-is-as-smooth-as-vegan-butter (https://www.radixdlt.com/blog/scrypto-is-as-smooth-as-vegan-butter) ## Flexathon URL: https://radix.wiki/contents/history/flexathon Updated: 2026-02-08 Summary: Flexathon /‘flɛksəθɔn/ was a 3-week hackathon project by Dan Hughes to demonstrat Flexathon [ /‘flɛksəθɔn/ ] was a 3-week hackathon project (https://docs.google.com/document/d/1HkywxibycsVjq5e63_ZgyRycBRI3Myya3xLC56tVOP8/edit#heading=h.1lj0fweix9i8) by Dan Hughes to demonstrate innovations the Radix team had been pioneering in areas like consensus algorithms and sharding. Radix Founder Demonstrates Cross-Shard Atomic Composability (https://youtu.be/nXhABv1B9lk?si=p_l9Wvs6AmX2WjR3&t=1500) Radix Founder Demonstrates Cross-Shard Atomic Composability Timeline and Duration Hughes dedicated his vacation time in November-December 2020 to build out a functioning prototype sharded network exhibiting features like atomic composability. Hughes provided updates and documentation of his progress via Twitter (https://twitter.com/search?q=%40fuserleer%20%23flexathon&src=recent_search_click&f=live) and GitHub. Goals The Flexathon project was designed to showcase the capabilities enabled by the Radix team's Cerberus consensus protocol within a sharded environment. Specifically, Hughes set out to build a working implementation that exhibited atomicity, composability and decentralization across shards. The ambitious timeline was aimed to dispel skepticism and doubts about the feasibility of the Radix innovations. Implementation The implementation consisted of a mix of old and newly developed components. Code was repurposed from the Radix core, TEMPO/CAST modules developed previously, Hughes' own R&D framework as well as freshly written code to meet the hackathon requirements. The prototype was not meant to resemble future Radix public networks, merely demonstrate Cerberus fundamentals. 
Components Over the three weeks, Hughes built out and demonstrated capabilities including: - Network connectivity between nodes - Sending 'atoms' across the network - Atom pool voting for consensus - Early versions of features like block production and proof-of-work algorithms Impact The Flexathon prototype served as a successful proof-of-concept for the Radix team's innovations and helped validate the potential of the Cerberus protocol. By showcasing key capabilities like cross-shard atomicity and decentralized composability, Hughes managed to garner interest in Radix's technology vision. However, the prototype itself was not intended to be a direct precursor to later Radix public networks, and substantial production hardening would be required. The piecemeal codebase, while helpful for rapid prototyping, did not meet the reliability and security standards demanded of live DLT networks. Nonetheless, the fundamental atomic composability exhibited during the demo laid the foundations for Radix's subsequent Olympia release leading up to the Alexandria mainnet. Components pioneered during Flexathon found use in further R&D by Hughes and fed into the roadmap for the Radix protocol moving forward. Media https://docs.google.com/document/d/1HkywxibycsVjq5e63_ZgyRycBRI3Myya3xLC56tVOP8/edit?tab=t.0#heading=h.1lj0fweix9i8 (https://docs.google.com/document/d/1HkywxibycsVjq5e63_ZgyRycBRI3Myya3xLC56tVOP8/edit?tab=t.0#heading=h.1lj0fweix9i8) ## European Blockchain Convention, 2024 URL: https://radix.wiki/contents/history/european-blockchain-convention-2024 Updated: 2026-02-08 Summary: The European Blockchain Convention, 2024, was held in Barcelona, Spain, on the 25th of September. The event was sponsored by the Radix Foundation. 
Media https://www.radixdlt.com/blog/european-blockchain-convention-event-highlights (https://www.radixdlt.com/blog/european-blockchain-convention-event-highlights) ## Austin Developer Event URL: https://radix.wiki/contents/history/austin-developer-event Updated: 2026-02-08 Summary: The Austin Developer Event was a workshop held on the 11th of June, 2022, within Consensus 2022 and organized by RDX Works. Media https://www.radixdlt.com/blog/austin-developer-event-consensus-2022 (https://www.radixdlt.com/blog/austin-developer-event-consensus-2022) ## Addix URL: https://radix.wiki/ecosystem/addix Updated: 2026-02-07 Summary: 💰 Assets:: ✅ Validator:: dashboard.radixdlt.com/network-staking/validator_rdx1swez5cqmw4d6tls0mcldehnfhpxge0mq7cmnypnjz909apqqjgx6n9 Status: 🟢 Addix is a community-led meme coin project built on the Radix (https://www.radixdlt.com/) blockchain platform. The project was conceived during a period of low sentiment and price (https://addix-xrd.gitbook.io/usdhit-on-radix) in the Radix ecosystem, with the intention of giving back to the community and creating a fun addition to the platform. Addix's native token, $HIT (https://ociswap.com/resource_rdx1t4v2jke9xkcrqra9sf3lzgpxwdr590npkt03vufty4pwuu205q03az/) , was created to serve as a means for "Radix Addicts" to "get their fix" within the ecosystem. Introduction The project was founded by Felix_xrd, a long-term Radix supporter who discovered the platform through Dan Hughes' Tempo whitepaper (https://addix-xrd.gitbook.io/usdhit-on-radix/usdhit-team/team-and-collaborations) on the Bitcointalk forum. 
Addix aims to differentiate itself from other meme coins by implementing novel features such as Rug-Proof smart contracts (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-usdhit-staking) and focusing on community engagement and development within the Radix ecosystem. Unlike many speculative cryptocurrency projects, Addix emphasizes that $HIT is not designed as a tool for speculation (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/usdhit-token) but rather as a fun and engaging way for community members to interact with the Radix ecosystem. The project's core values include transparency, community involvement, and contributing to the broader development of the Radix platform. As of 2024, Addix has implemented several key features, including a Rug-Proof staking mechanism (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-usdhit-staking) , partnerships with other Radix projects like FOMO, and ongoing community initiatives such as the HIT SQUAD (https://addix-xrd.gitbook.io/usdhit-on-radix/community-led/hit-squad) . The project continues to evolve, with plans for further development and integration within the Radix ecosystem. History Addix was founded by Felix_xrd (https://addix-xrd.gitbook.io/usdhit-on-radix/usdhit-team/team-and-collaborations) , a long-term Radix supporter and enthusiast. Felix's journey with Radix began when he discovered the Tempo whitepaper (https://www.radixdlt.com/blog/tempo-consensus-lessons-learned) written by Dan Hughes on the Bitcointalk forum. This discovery led Felix to become a dedicated Radix supporter, often referred to as a "Radix Maxi." The project was conceived during a period of low sentiment and price (https://addix-xrd.gitbook.io/usdhit-on-radix) in the Radix ecosystem, commonly referred to as an "ATL" (All-Time Low). The founders recognized that despite market conditions, there was a core group of supporters who remained committed to the Radix vision and community. 
This observation inspired the creation of Addix as a way to engage and reward these dedicated community members. Key milestones in Addix's history include: - Launch of the $HIT token (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/usdhit-token) : The project's native token was created with a total supply of 1,000,000,000,000 (1,000 B) tokens. - Initial Liquidity Provision (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/initial-liquidity) : On March 24, 2024, Addix provided initial liquidity exclusively through Ociswap, a decision made to avoid the risks associated with distributed liquidity at the early stages of the project. - Collaboration with DogeCube (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/initial-liquidity) : On March 26, 2024, Addix donated 12.7 B HIT to the DogeCube Basket Index, marking an early collaboration within the Radix ecosystem. - Implementation of Rug-Proof Staking (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-usdhit-staking) : In April 2024, Addix successfully tested and implemented its novel Rug-Proof staking mechanism, locking 127.5 B HIT tokens in a smart contract for future staking rewards. - Tokenomics Revision (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/tokenomics) : In May 2024, following community feedback, Addix revised its tokenomics, reducing founder allocations and dedicating more tokens to validator rewards. - Partnership with FOMO (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-validator-node) : On June 9, 2024, Addix announced a partnership with FOMO to run a Radix Network Validator Node, further integrating the project into the Radix ecosystem. - Validator Node Success (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-validator-node) : As of July 31, 2024, the ADDIX-FOMO Validator Node entered the Top 100 validators on the Radix network, marking a significant achievement for the project. 
Throughout its history, Addix has demonstrated a commitment to community feedback, transparent operations, and contributing to the broader Radix ecosystem. The project continues to evolve, with ongoing initiatives aimed at enhancing user engagement and adding value to the Radix platform. Project Overview Addix is a community-led meme coin project (https://addix-xrd.gitbook.io/usdhit-on-radix) built on the Radix blockchain platform. The project's name, "Addix," is a playful reference to the notion that supporters are "addicted" to the Radix vision, community, and the prospect of solving the blockchain trilemma. The primary goals of Addix include: - Giving back to the Radix community (https://addix-xrd.gitbook.io/usdhit-on-radix) : Addix aims to reward and engage community members who have made significant contributions to the Radix ecosystem. - Adding fun and engagement (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/usdhit-token) : As a meme coin, Addix is designed to be a fun addition to the Radix ecosystem, encouraging community interaction and engagement. - Promoting transparency and trust (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-usdhit-staking) : Through innovations like the Rug-Proof smart contracts, Addix aims to set new standards for transparency and trustworthiness in the cryptocurrency space. - Contributing to the Radix ecosystem (https://addix-xrd.gitbook.io/usdhit-on-radix/usdhit-team/team-and-collaborations) : Addix actively collaborates with other projects and contributes to the development of the Radix platform. The project's philosophy is rooted in the belief that community-driven initiatives can play a significant role in the growth and adoption of blockchain technology. Addix emphasizes that while it is a meme coin, it is not designed as a tool for speculation (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/usdhit-token) . 
Instead, it focuses on creating value through community engagement, innovative features, and ecosystem contributions. Addix operates with a strong emphasis on community feedback and involvement. This is evident in their willingness to revise tokenomics (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/tokenomics) based on community input and their ongoing search for community members to join the HIT SQUAD (https://addix-xrd.gitbook.io/usdhit-on-radix/community-led/hit-squad) . The project leverages various aspects of the Radix platform, including its native token creation capabilities (https://docs.radixdlt.com/docs/tokens) , smart contract functionality, and validator node system. By doing so, Addix not only benefits from the Radix infrastructure but also contributes to its ecosystem by demonstrating practical applications of these features. As with all cryptocurrency projects, Addix comes with risks, and the project explicitly disclaims (https://addix-xrd.gitbook.io/usdhit-on-radix) that nothing in their documentation should be considered financial advice. They encourage potential investors to conduct thorough research and exercise caution, reminding users that "it's just a $HIT Coin." Despite its playful nature, Addix aims to be a serious contributor to the Radix ecosystem, blending humor with genuine utility and community value. The project continues to evolve, with ongoing development and new initiatives aimed at enhancing its role within the Radix community. Technology 4.1 $HIT Token The $HIT token (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/usdhit-token) is the native cryptocurrency of the Addix project. Key features of the token include: - Total supply (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/tokenomics) : 1,000,000,000,000 (1,000 B) tokens. 
- Resource Address (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/usdhit-token) : resource_rdx1t4v2jke9xkcrqra9sf3lzgpxwdr590npkt03vufty4pwuu205q03az - No further minting or burning (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/tokenomics) capabilities. The token is designed to facilitate interactions within the Addix ecosystem, including staking and participation in community initiatives. 4.2 Rug-Proof Smart Contracts A key technological innovation of Addix is its Rug-Proof smart contract (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-usdhit-staking) for staking. This contract was developed by forking and modifying Astrolescent's Staking Contract (https://github.com/Astrolescent-Official/astrolescent-contracts-staking) , originally written by Timan. Key features of the Rug-Proof contract include: - Locking of staking rewards (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-usdhit-staking) in a contract that the founder cannot access. - Trustless distribution of rewards according to a predefined schedule. - Open-source code, encouraging transparency and adoption by other projects. The technical implementation involves: - Use of a OneResourcePool (https://github.com/nemster/astrolescent-contracts-staking/blob/main/src/lib.rs#L22) for user stakes. - A Vault for future rewards (https://github.com/nemster/astrolescent-contracts-staking/blob/main/src/lib.rs#L23) . - Restricted access (https://github.com/nemster/astrolescent-contracts-staking/blob/main/src/lib.rs#L13) to the deposit_rewards function. 4.3 Validator Node Addix operates a Radix Network Validator Node (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-validator-node) in partnership with FOMO. The node's specifications include: - Hosting: IONOS cloud server. - Security: Firewall management, unlimited traffic, Wildcard SSL certificate. - Data center compliance: ISO-27001. 
- Hardware specifications (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-validator-node) : - CPU: Intel® Xeon® E3-1230 v6 4 cores x 3.5 GHz (3.9 GHz Turbo Boost). - RAM: 32 GB DDR4 ECC. - Storage: 480 GB SSD (2 × 480 GB SSD) Software RAID 1. The validator node operates with specific parameters (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-validator-node) : - 100% fee. - 15% of Addix tokens (150 B) allocated for node delegator airdrops over four years. - 2/3 of generated XRD used for Addix buybacks. - 1/3 of generated XRD used for FOMO buybacks. This technology stack demonstrates Addix's commitment to security, transparency, and active participation in the Radix network's consensus mechanism. By leveraging Radix's infrastructure and adding novel features like the Rug-Proof contract, Addix aims to provide a secure and engaging platform for its community while contributing to the broader Radix ecosystem. Tokenomics The tokenomics of Addix have undergone revisions based on community feedback, demonstrating the project's commitment to transparency and community-driven decision-making. Total Supply (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/tokenomics) : 1,000,000,000,000 (1,000 B) $HIT tokens. Initial Token Distribution (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/tokenomics) (April 2024): Revised Token Distribution (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/tokenomics) (May 2024): The revision in May 2024 was made in response to community feedback (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/tokenomics) , particularly regarding the reserve allocation. Key changes included: - Reducing founder and co-founder shares from 5% each to 2.5% each. - Allocating the former reserve to validator rewards. - Splitting the founder allocation between founder and co-founder. 
Liquidity Provisions (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/initial-liquidity) : - Initial liquidity was provided exclusively through Ociswap on March 24, 2024. - A donation of 12.7 B HIT (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/initial-liquidity) was made to the DogeCube Basket Index on March 26, 2024. - On May 27, 2024, 50 B HIT was added to a single-sided Liquidity Pool on DefiPlaza (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/initial-liquidity) to increase liquidity provision. Liquidity Strategies (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/initial-liquidity) : - The project has experimented with OciSwap "Splash as a Service" for liquidity pool rewards. - Plans to use RadLock for locking liquidity provider (LP) receipts once available. - Considering pulling out initial Founder LP provision on OciSwap and locking the remaining LP Receipt. - Intending to add substantial liquidity to DefiPlaza's single-sided Liquidity Pools and subsequently lock these LP Receipts. Founder Allocations (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/founder-allocations) : - Reduced from 5% each for Founder and Co-Founder to 2.5% each. - Plans to lock these allocations using RadLock once it becomes available. - Considering time-lock (e.g., 4 years) or value-lock (e.g., when Addix market cap reaches $100m) mechanisms. These tokenomics demonstrate Addix's commitment to fair distribution, community involvement, and long-term sustainability. The project's willingness to adjust its token allocation based on community feedback highlights its adaptability and focus on community-driven development within the Radix ecosystem. Features and Services 6.1 Rug-Proof Staking One of Addix's key innovations is its Rug-Proof staking mechanism (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-usdhit-staking) , which was implemented in April 2024. 
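The Rug-Proof staking mechanism follows a common share-based pool pattern: user stakes sit in a single pool (the OneResourcePool noted in section 4.2), rewards are deposited into it from a locked vault, and stakers hold share tokens (StHIT) whose redemption value in $HIT rises as rewards arrive. The sketch below is a minimal Python model of that pattern under stated assumptions; the class, method names, and numbers are purely illustrative and are not the actual Scrypto contract.

```python
# Illustrative model of a share-based staking pool; not the actual
# Scrypto contract. All names and numbers are hypothetical.

class StakingPool:
    def __init__(self):
        self.pooled = 0.0   # total staked tokens (e.g. $HIT) held by the pool
        self.shares = 0.0   # total share tokens (e.g. StHIT) outstanding

    def stake(self, amount):
        """Deposit tokens and mint shares at the current pool ratio."""
        minted = amount if self.shares == 0 else amount * self.shares / self.pooled
        self.pooled += amount
        self.shares += minted
        return minted

    def deposit_rewards(self, amount):
        """Add rewards to the pool without minting new shares,
        raising the redemption value of every existing share."""
        self.pooled += amount

    def redeem(self, share_amount):
        """Burn shares and withdraw the proportional amount of tokens."""
        payout = share_amount * self.pooled / self.shares
        self.shares -= share_amount
        self.pooled -= payout
        return payout

pool = StakingPool()
shares = pool.stake(100)    # first staker mints shares 1:1
pool.deposit_rewards(10)    # scheduled rewards flow into the pool
print(pool.redeem(shares))  # 110.0: each share is now worth more than 1 token
```

Because rewards enter the pool only through `deposit_rewards`, restricting who may call that method (as the contract's restricted `deposit_rewards` function reportedly does) is what makes the distribution schedule trustless from the staker's point of view.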
This feature aims to provide a trustless and transparent way for token holders to stake their $HIT tokens and earn rewards. Key aspects of the Rug-Proof staking include: - 12.75% of the total $HIT supply (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-usdhit-staking) (127.5 B tokens) locked in a Rug-Proof smart contract. - Rewards distributed according to a predefined schedule (https://addix-xrd.gitbook.io/usdhit-on-radix/rug-proof-emission-schedule) . - Trustless distribution (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-usdhit-staking) of rewards, preventing the founder from accessing the locked tokens. The staking mechanism works similarly to Radix's native staking system (https://docs.radixdlt.com/docs/staking-concepts) , using a concept of Liquid Staked Units (LSUs). When users stake $HIT, they receive StHIT tokens, which represent their stake in the pool. As rewards accumulate, the value of StHIT relative to $HIT increases, allowing stakers to benefit from their participation. Staking rewards (https://addix-xrd.gitbook.io/usdhit-on-radix/rug-proof-emission-schedule) are distributed according to the following annual schedule: - Year 1: 5% - Year 2: 2.5% - Year 3: 1.25% - Year 4: 1.25% - Years 5-8: Decreasing percentages from 0.75% to 0.625% 6.2 Validator Node Rewards Addix operates a Radix Network Validator Node (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-validator-node) in partnership with FOMO, which entered the Top 100 validators on July 31, 2024. This node plays a crucial role in the Radix network's consensus mechanism and provides an additional avenue for community participation and rewards. Key features of the Validator Node rewards include: - 15% of the total $HIT supply (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/tokenomics) (150 B tokens) allocated for validator rewards. 
- Rewards locked in a Rug-Proof smart contract (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-validator-node) . - Emission schedule (https://addix-xrd.gitbook.io/usdhit-on-radix/rug-proof-emission-schedule) starting from July 26, 2024. The validator node operates with the following parameters: - 100% fee (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-validator-node) . - 2/3 of generated XRD (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-validator-node) used for Addix buybacks. - 1/3 of generated XRD (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-validator-node) used for FOMO buybacks. - Additional FOMO token airdrops (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-validator-node) for delegators upon successful entry into the top 100 validators. The validator rewards emission schedule (https://addix-xrd.gitbook.io/usdhit-on-radix/rug-proof-emission-schedule) for the first year is as follows: - Months 1-3: 20 B - Months 4-6: 15 B - Months 7-9: 12.5 B - Months 10-12: 12.5 B This dual reward system of Rug-Proof staking and validator node participation provides multiple ways for the Addix community to engage with the project and earn rewards. Both mechanisms are designed with transparency and community trust in mind, aligning with Addix's overall mission of contributing positively to the Radix ecosystem. Team and Collaborations The Addix project is built on a foundation of collaboration and community involvement, with a core team and various partnerships contributing to its development and growth within the Radix ecosystem. Core Team: - Founder: Felix (https://addix-xrd.gitbook.io/usdhit-on-radix/usdhit-team/team-and-collaborations) - Known as Felix_xrd. - Long-term Radix supporter and enthusiast. - Discovered Radix through Dan Hughes' Tempo whitepaper (https://www.radixdlt.com/blog/tempo-consensus-lessons-learned) on the Bitcointalk forum. 
- Described as a "Radix Maxi" due to his strong belief in the platform. - Co-Founder: Haseeb (https://addix-xrd.gitbook.io/usdhit-on-radix/usdhit-team/team-and-collaborations) - Experienced developer with a passion for coding. - Previously contributed to the Dexter project within the Radix ecosystem. - Brings technical expertise to the Addix project. Key Collaborators: - Validator Partner: Team FOMO (https://addix-xrd.gitbook.io/usdhit-on-radix/usdhit-team/team-and-collaborations) - Led by FOMO Jim - Collaborated on Staking and Airdrop Component - Joint operation of the ADDIX-FOMO Validator Node (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-validator-node) - Technical Collaborator: Don Marco (https://addix-xrd.gitbook.io/usdhit-on-radix/usdhit-team/team-and-collaborations) - Contributed to the development of the Rug-Proof smart contract (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-usdhit-staking) - Forked and modified Astrolescent's Staking Contract for Addix's use - Art Collaborator: Endry Didier (https://addix-xrd.gitbook.io/usdhit-on-radix/usdhit-team/team-and-collaborations) - Contributes to the visual aspects of the Addix project - Marketing Collaborator: Team $NOW (https://addix-xrd.gitbook.io/usdhit-on-radix/usdhit-team/team-and-collaborations) - Assists with marketing strategies for Addix Other collaborations: - Bot builders (https://addix-xrd.gitbook.io/usdhit-on-radix/usdhit-team/team-and-collaborations) - Radix Degens (https://addix-xrd.gitbook.io/usdhit-on-radix/usdhit-team/team-and-collaborations) - Helpful Coders (https://addix-xrd.gitbook.io/usdhit-on-radix/usdhit-team/team-and-collaborations) The Addix team emphasizes the importance of decentralization in their approach to collaboration. 
They bring in collaborators and consult advisors (https://addix-xrd.gitbook.io/usdhit-on-radix/usdhit-team/team-and-collaborations) for specialized input as needed, reflecting their commitment to leveraging diverse expertise within the community. Community Involvement: Addix actively seeks community participation in various aspects of the project: - HIT SQUAD (https://addix-xrd.gitbook.io/usdhit-on-radix/community-led/hit-squad) : An initiative to recruit 1-2 outstanding community members as ambassadors, with the potential to join the team. - Ongoing search for collaborators (https://addix-xrd.gitbook.io/usdhit-on-radix/usdhit-team/team-and-collaborations) in different areas of expertise. The project's collaborative approach extends beyond its own boundaries, with a stated goal of contributing to the Radix ecosystem and community as a whole (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-validator-node) . This is evident in their partnerships, open-source contributions, and active participation in the Radix validator network. By fostering these collaborations and maintaining an open approach to community involvement, Addix aims to position itself as an integral part of the Radix ecosystem while promoting the growth and adoption of the platform as a whole. Community Initiatives Addix places a strong emphasis on community engagement and involvement, recognizing the importance of a vibrant and active user base in the success of a decentralized project. Several key initiatives highlight Addix's commitment to fostering a strong community: 8.1 HIT SQUAD The HIT SQUAD (https://addix-xrd.gitbook.io/usdhit-on-radix/community-led/hit-squad) is a prominent community-focused initiative by Addix. Its main features include: - Recruiting 1-2 outstanding community members as ambassadors. - Potential for ambassadors to join the Addix team. - Focus on highly engaged and active individuals. - Emphasis on contributing to HIT's long-term success. 
This initiative demonstrates Addix's commitment to identifying and nurturing talent within its community, potentially creating a pipeline for future team members and project leaders. 8.2 Community-Driven Development Addix has shown a willingness to adapt based on community feedback, as evidenced by: - Revision of tokenomics (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/tokenomics) in May 2024 following community discussions. - Reduction of founder allocations from 5% to 2.5% each for founder and co-founder. - Introduction of a 15% allocation for validator rewards. These changes highlight Addix's responsiveness to community input and its commitment to transparent, community-driven development. 8.3 Open-Source Contributions Addix contributes to the broader Radix ecosystem through open-source development: - Rug-Proof smart contract (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-usdhit-staking) made available for other projects to adopt. - Collaboration on various community projects, including work with Team FOMO (https://addix-xrd.gitbook.io/usdhit-on-radix/usdhit-team/team-and-collaborations) on staking and airdrop components. 8.4 Community Engagement Events While specific details are not provided in the documentation, it's common for cryptocurrency projects like Addix to organize various community engagement events. These might include: - AMAs (Ask Me Anything) sessions (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/promotion-fun) with team members. - Community contests and giveaways. - Educational webinars about the Radix ecosystem and Addix's role within it. 8.5 Validator Node Participation The ADDIX-FOMO Validator Node (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-validator-node) serves as a community engagement tool by: - Allowing community members to delegate their XRD to the node. - Providing rewards to delegators, including both $HIT and FOMO token airdrops. 
- Engaging the community in the broader Radix ecosystem through active participation in network validation. 8.6 Promotion and Fun Initiatives Addix has allocated 5% of its total token supply (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/tokenomics) (50 B $HIT) for promotion and fun activities. While specific details are not provided, this allocation suggests a commitment to: - Organizing giveaways (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/promotion-fun) . - Encouraging meme creation (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/promotion-fun) . - Fostering community discussions (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/promotion-fun) . - Other engagement activities that align with Addix's identity as a meme coin. These community initiatives reflect Addix's understanding of the importance of an engaged and active community in the success of a decentralized project. By fostering open communication, encouraging participation, and providing opportunities for community members to contribute meaningfully to the project, Addix aims to build a strong, loyal user base within the Radix ecosystem. Future Plans and Roadmap While Addix does not provide a formal, detailed roadmap in the available documentation, several future plans and ongoing initiatives can be inferred from the project's communications and current activities: 9.1 Continued Development of Rug-Proof Mechanisms Addix has demonstrated a commitment to developing Rug-Proof smart contracts (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-usdhit-staking) and intends to continue this work: - Potential expansion of Rug-Proof concepts to other aspects of the project. - Encouraging adoption of Rug-Proof contracts by other projects in the Radix ecosystem. 
9.2 Enhanced Liquidity Strategies The project has outlined plans to improve its liquidity provision: - Utilization of RadLock (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/initial-liquidity) for securing liquidity once it becomes available. - Consideration of pulling out initial Founder LP provision on OciSwap (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/initial-liquidity) and locking the remaining LP Receipt. - Plans to add substantial liquidity to DefiPlaza's single-sided Liquidity Pools (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/initial-liquidity) and subsequently lock these LP Receipts. 9.3 Expansion of Validator Node Operations Following the success of entering the Top 100 validators on July 31, 2024 (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/rug-proof-validator-node) , Addix may look to: - Increase the node's ranking within the validator set. - Explore additional partnerships or node operations to further decentralize the Radix network. 9.4 Community Growth Initiatives Addix has shown a strong focus on community engagement and may expand these efforts: - Continued recruitment for the HIT SQUAD (https://addix-xrd.gitbook.io/usdhit-on-radix/community-led/hit-squad) . - Development of new community-driven initiatives and governance models. - Expansion of promotional and fun activities using the allocated 5% of tokens (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/tokenomics) . 9.5 Ecosystem Contributions As part of its commitment to the Radix ecosystem, Addix may: - Continue to collaborate with other projects (https://addix-xrd.gitbook.io/usdhit-on-radix/usdhit-team/team-and-collaborations) in the Radix space. - Develop new tools or services that benefit the broader Radix community. - Participate in ecosystem-wide initiatives to promote Radix adoption. 
9.6 Token Utility Expansion While specific plans are not detailed, it's common for cryptocurrency projects to continually explore new use cases for their tokens. Addix may look to: - Develop new features or services that utilize the $HIT token (https://addix-xrd.gitbook.io/usdhit-on-radix/proof-of-usdhit/usdhit-token) . - Explore partnerships that could increase the utility of $HIT within and potentially beyond the Radix ecosystem. 9.7 Governance Implementation As the project matures, Addix may consider implementing more formal governance structures: - Potential introduction of a DAO (Decentralized Autonomous Organization) for community-driven decision-making. - Development of voting mechanisms using $HIT or StHIT tokens. It's important to note that as a "living document," (https://addix-xrd.gitbook.io/usdhit-on-radix) Addix's plans and roadmap are subject to change based on community feedback, market conditions, and developments within the Radix ecosystem. The project emphasizes adaptability and community involvement in its decision-making processes, suggesting that future plans will likely evolve with input from the Addix community. ## RADIATOR URL: https://radix.wiki/ecosystem/radiator Updated: 2026-02-07 Summary: RADIATOR is an organization whose primary focus is on creating and managing the $RDT token, a Web3 cashback utility token that rewards users for participating in the Radix Network. The RADIATOR project establishes a cashback system and incentive structure for holders of the $RDT token, encouraging users to stake $XRD on their node and engage with the ecosystem's partners. Utility Incentives & Rewards RADIATOR and its partners offer various incentives and rewards to those who participate in their ecosystem. For example, when purchasing a cashback-eligible product, users receive not only $XRD cashback but also an $RDT reward based on the amount of spent $XRD. 
In addition, a portion of every paid incentive and reward is burned, helping to regulate the overall supply. Cashbacks $RDT's cashback system rewards holders with $XRD when they purchase or use specific products from partnering companies. This creates an incentive for users to support the Radix Network while benefiting from the rewards offered by the cashback system. As the first cashback utility token on Radix DLT, $RDT offers supporters airdrops for staking on the RADIATOR node or completing tasks under specific conditions (such as participating in giveaways). Stakers are also eligible for bonuses when completing tasks. Requirements During Olympia, eligibility for cashbacks was mostly linked to holding a specific amount of $RDT in the wallet (with a holding duration of at least 7 days). These amounts varied between 150 and 666 $RDT. Tokenomics $RDT is RADIATOR’s cashback utility token. It aims to provide a reward ecosystem within the Radix Network through the implementation of a cashback system. Users can earn passive income by staking $XRD on the RADIATOR node and participating in the reward ecosystem. RADIATOR Tokenomics. $RDT Cycle (Market Mechanisms) The $RDT Cycle illustrates how the token's value is kept in equilibrium via the burn mechanism, which scales with the amount of distributed incentives. The project also invests a percentage of profit - besides the regular liquidity providing - in buybacks of $RDT in order to create buy pressure. $RDT Cycle Burn Mechanism 25% of the total supply is reserved purely for incentives. An equal amount will steadily be burned to keep the token's value in equilibrium. Buyback Mechanism We have implemented a buyback system where a certain percentage of profit is used to buy back $RDT from the market in order to help create buy pressure. After the 5% liquidity provision is finished, the buyback percentage will increase noticeably. 
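The burn-and-buyback cycle described above can be sketched as a simple supply model. All figures, names, and the flat burn-per-incentive assumption below are hypothetical illustrations, not RADIATOR's actual on-ledger mechanism:

```python
# Illustrative model of the $RDT cycle: every incentive payout is matched by
# an equal burn, and a share of profit funds market buybacks. The total
# supply and all parameters here are made up for demonstration only.

class RdtCycle:
    def __init__(self, total_supply=100_000_000):
        self.total_supply = total_supply
        self.incentive_reserve = total_supply * 0.25  # 25% reserved for incentives
        self.burned = 0

    def pay_incentive(self, amount):
        # Pay an incentive from the reserve; an equal amount is burned,
        # shrinking total supply so distribution and burn stay in equilibrium.
        self.incentive_reserve -= amount
        self.total_supply -= amount
        self.burned += amount

    def buyback_amount(self, profit_xrd, buyback_share, xrd_per_rdt):
        # $RDT purchasable with the profit share earmarked for buybacks.
        return profit_xrd * buyback_share / xrd_per_rdt

cycle = RdtCycle()
cycle.pay_incentive(10_000)
print(cycle.total_supply)                        # 99990000
print(cycle.buyback_amount(5_000, 0.25, 0.25))   # 5000.0 $RDT bought back
```

The point of the equal-burn rule is that incentive distribution never inflates net supply; the buyback adds demand on top of that deflationary floor.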
Liquidity Provision We steadily provide our own liquidity until we reach 5m $RDT, which equals 5% of supply according to our Tokenomics. Marketing & Development Funds allocated for marketing and development are used for exchange listing fees, marketing collaborations, and commissioned work, promoting the growth and adoption of the $RDT token. Founders + Vesting RADIATOR's founders have extensive experience in project management, customer relations, marketing, sales, and software development. Founder tokens are fully unlocked after three years, with distribution starting one year after the initial token listing date and at 25% intervals every six months. RADIATOR Vesting Schedule RADIATOR Node By staking $XRD on the RADIATOR node, users can earn passive income through staking rewards in $XRD and frequent $RDT airdrops. This encourages participation and investment in the RADIATOR ecosystem. The current nominal fee is 1.99%, which equals an effective fee of 0.16% at 8% APY. 💡 **New Babylon Validator Address:** validator_rdx1s0k2g57gxld3nydl8535e2tkp6tduh6pmynd3mf2a3q428zh09n24s Staking Airdrops Delegators who stake $XRD on the RADIATOR node receive regular airdrops. Additional irregular airdrops may be available for special participation in the RADIATOR and partner ecosystem. RADIATOR Staking Airdrop Diagram Staking Airdrops currently decrease by 10,000 $RDT per month until they reach 100,000 $RDT, in order to stabilize the price and reduce sell pressure while still attracting people to stake on our node. These terms can be adjusted at any time by the team. ## Radix Desktop Tool URL: https://radix.wiki/ecosystem/radix-desktop-tool Updated: 2026-02-07 Summary: The Radix Desktop Tool is an open source utility project built by Radix community member HODL6666. The tool is under the MIT license. 
The Radix Desktop Tool is an open source utility project built by Radix community member HODL6666 (https://www.notion.so/Rust-Ecosystem-Ernest-Kissiedu-3f242c8c63e14e67be86070b912cd151?pvs=21) . The tool is under the MIT license. Features - Open Source: The code can be inspected on its Github page here. - Token Transfers: Single → Multiple, Multiple → Multiple. - QR Code Generation - Token Creation - Format Conversion: Bech32 Address ⇌ Hexadecimal String. - Asset Verification - Transaction History - Transaction Manifest Builder - Airdrop Tool: By entering the sending wallet’s private key, the Airdrop Tool can send airdrops to tens of thousands of addresses at once, bypassing the limitation of the Radix Wallet, which can only send to 50-90 addresses at a time. “Because the private key is related to the security of funds, the tool must be open source and free, having people review my code builds trust, and run as Desktop App, not Web App.” - HODL6666 - Scrypto Package Deployer - Entity Verification - $XRD Faucet - Local Settings Versions v0.1.5 “This tool is intended for general users and developers. Clean and simple.” - HODL6666 · When using the tool for the first time, you can switch the network to Stokenet and use Wallet Generate to generate a new account address and private key. · Get test tokens via the XRD Faucet to try out all the features in this tool. Screenshots Token Transfer: Single to Multiple Token Transfer: Multiple to Multiple Transaction Manifest Builder Scrypto Package Deployer Token Creation Asset Verification Transaction History Entity Verification Bech32 Address ⇌ Hexadecimal String QR Code Generation $XRD Faucet Local Settings ## Weft Finance URL: https://radix.wiki/ecosystem/weft-finance Updated: 2026-02-07 Summary: Weft Finance is a decentralized lending and borrowing platform built on Radix. At the heart of Weft lies the creation and management of collateralized debt positions (CDPs). Weft Finance is a decentralized lending and borrowing platform built on Radix. 
At the heart of Weft lies the creation and management of collateralized debt positions (CDPs) represented by NFTs, known as ‘Wefties.’ History Weft was founded by two Ivorian engineers, Yetinin Coulibaly and Atoumbré Kouassi. Living on different continents, with Coulibaly in Paris and Kouassi in Abidjan, they came together with a shared vision for financial innovation and economic development in Africa. Their collaboration began in 2021 when both became active members of the Radix Community. Despite having never met in person, the two engineers discovered a shared vision for the future and a strong alignment in values. After meeting in person in Abidjan in fall 2022, they decided to form a partnership and build their first decentralized application (dApp) for lending and borrowing financial services. This led to the creation of Weft, aiming to harness the transformative potential of RadixDLT for the future of DeFi. The platform is now developed by a team of seven individuals, collectively known as Weft'ers. Features Weft offers several key features to facilitate lending and borrowing of digital assets: - Lending Pools: Weft utilizes lending pools that hold assets deposited by lenders. Each pool is designated for a specific asset type. Lenders receive deposit units representing their share of the pool. - Loan Units: Borrowed amounts are tracked using loan units, which reflect a borrower's share of debt obligation. Interest accrual is handled through the loan unit system. - Interest Rate Strategies: Interest rates are set dynamically based on lending pool usage through predefined interest rate strategies. This allows rates to adapt to market conditions. - Borrowing Power Delegation: Users can delegate their borrowing power by minting a linked "delegated Weftie" NFT and sending it to another user. The recipient can then borrow without needing collateral. - Liquidations: Loans with insufficient collateral can be liquidated by external entities or automatically. 
This sells collateral to repay loans and brings loan-to-value ratios back into a healthy range. - User Positions Operations: Weft supports position modifications like collateral swaps and direct loan repayment using collateral ("self-liquidation"). The system is designed to be flexible, secure, and capture the nuances of each loan through use of the non-fungible Wefties. Components Weft Finance relies on several key components to enable its lending and borrowing functionality: Lending Pools - Hold assets deposited by lenders and act as reserves for borrowers - Each pool is for a specific asset type (e.g. $XRD) - Mint deposit units to track lenders' shares - Manage accrued interest on loans - Provide flash loans Lending Market - Acts as the interface between lenders/borrowers and lending pools - Handles borrowing, repaying loans, liquidations, withdrawals - Ensures security through badges and access rules - Validates and extends Wefties to enable pool interactions Wefties - NFTs that contain users' collateral & loan positions - Secure metadata that stores deposit unit collateral amounts and loan unit borrowed amounts - Controls enforced by Lending Market badge $WEFT Token - Last resort protection against market volatility risks - Stakers can deposit $WEFT to mitigate potential insolvencies - Stakers earn a portion of collected fees as insurance premiums This modular architecture maximizes flexibility and security of lending operations. 
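The deposit-unit and loan-unit accounting described in the components above can be illustrated with a toy model. This is a sketch under simplifying assumptions (a single pool, no fees, hypothetical numbers), not Weft's actual Scrypto implementation:

```python
# Toy model of pool-share accounting: deposit units track a lender's share of
# the pool, loan units track a borrower's share of total debt, and interest
# accrues by growing the debt while unit counts stay constant.

class LendingPool:
    def __init__(self):
        self.assets = 0.0          # tokens currently held by the pool
        self.total_borrowed = 0.0  # principal plus accrued interest owed
        self.deposit_units = 0.0   # total deposit units minted
        self.loan_units = 0.0      # total loan units minted

    def unit_value(self):
        # Each deposit unit's claim on pool assets plus outstanding loans.
        total = self.assets + self.total_borrowed
        return total / self.deposit_units if self.deposit_units else 1.0

    def deposit(self, amount):
        units = amount / self.unit_value()
        self.assets += amount
        self.deposit_units += units
        return units

    def borrow(self, amount):
        # First borrow mints one loan unit per token; later borrows scale by
        # the ratio of existing loan units to total outstanding debt.
        if self.total_borrowed == 0:
            units = float(amount)
        else:
            units = amount * self.loan_units / self.total_borrowed
        self.assets -= amount
        self.total_borrowed += amount
        self.loan_units += units
        return units

    def accrue_interest(self, rate):
        # Interest raises total_borrowed; deposit and loan units stay
        # constant, so each deposit unit is worth more and each loan unit
        # owes more, with no per-account bookkeeping required.
        self.total_borrowed *= 1 + rate

pool = LendingPool()
pool.deposit(1000)           # lender receives 1000 deposit units
pool.borrow(400)             # borrower receives 400 loan units
pool.accrue_interest(0.05)   # 5% interest accrues on the 400 borrowed
print(round(pool.unit_value(), 3))   # 1.02: each deposit unit gained value
```

The design point is that redeeming or repaying never requires iterating over accounts: value flows to all unit holders implicitly through the unit price.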
Operations Weft supports several key operations for lenders and borrowers: Lending - Users contribute assets to lending pools - They receive deposit units representing pool shares - Deposit units can be redeemed for assets Borrowing - Validates the Weftie NFT and extends it - Executes borrowing order through pool interactions - Performs health checks on loans and updates Weftie Interest Accrual - Interest rates set dynamically based on pool usage - Accrued by increasing total borrowed amount - Deposit and loan units remain constant Revenue - Collected from loan interest, flash loan fees, liquidation bonuses - Shared between operations costs and insurance module Insurance Module - Accepts staked $WEFT tokens - Tokens sold to cover losses in extreme situations - Stakers earn portion of revenue as insurance premium Weft aims to make lending and borrowing as seamless as possible while keeping operations decentralized and secure. Roadmap Weft Finance has laid out a roadmap to drive the growth and adoption of its lending and borrowing platform: Q3 2023 - ✅ Launched validator node, distributing $WEFT tokens to stakers - ✅ Introduced Weft Alpha version on testnets for early feedback Q4 2023 - ✅ Initiated $WEFT token listing and liquidity mining incentives - ✅ Released Beta version on updated testnet after Babylon launch - ✅ Started auditing Scrypto code prior to mainnet - ✅ Launched mainnet MVP after completing audits - ✅ Introduced Weft staking and early adopter incentives 2024 - Planning interface upgrades and mobile app launch - Transitioning towards a DAO model by end of 2024 The roadmap focuses on iterative community-driven development, testing, and upgrades to eventually decentralize Weft into a DAO. Team Atoumbré Kouassi (https://www.notion.so/Atoumbr-Kouassi-87b1cf2adea44c5b8633cf125b0a02d7?pvs=21) With a Master's degree in Applied Statistics and Economics, Kouassi has had a successful career in support service management in Abidjan. 
A dedicated professional, Kouassi has been following and experimenting with blockchain technology since 2013. His search for an efficient distributed ledger technology led him to discover Radix DLT, which would become the foundation for Weft. Yetinin Coulibaly Holding a master's degree in Embedded Electronics and Industrial Computing, Coulibaly is a lead developer based in Paris. He has a strong entrepreneurial spirit, having founded a Senegal-based startup that provided digital ticketing solutions for the transportation and event sectors. Coulibaly's drive and interest in blockchain technology eventually led him to join the Radix Community and embark on the journey of co-founding Weft. Zivile Community Manager Penifana Frontend Developer Roland Backend Developer Maxence Project Manager Amadou PhD Advisor providing research guidance ## CaviarNine URL: https://radix.wiki/ecosystem/caviarnine Updated: 2026-02-07 Summary: CaviarNine is a comprehensive DeFi ecosystem offering a range of products and services. The CaviarNine suite of products is also referred to as the FLOOP DeFi Ecosystem. CaviarNine is a comprehensive DeFi ecosystem offering a range of products and services. The CaviarNine suite of products is also referred to as the FLOOP DeFi Ecosystem (https://blog.caviarnine.com/floop-tokenomics-part-1-72e2a4c1fc62) after its native $FLOOP token. Background The platform is owned and operated by CaviarNine Limited BVI (https://docs.caviarnine.com/introduction/terms-and-conditions) , a company incorporated in the British Virgin Islands. The CaviarNine ecosystem includes various services, such as The Aggregator and liquidity provisioning tools, developed by Caviar Labs (https://docs.caviarnine.com/introduction/terms-and-conditions) , a Singapore-registered software development house and subsidiary of Invariance Pte Limited (https://www.linkedin.com/company/invarianceptelimited/about/) . 
Products Aggregator The Aggregator (https://docs.caviarnine.com/products-floop/aggregator) is a unified front-end gateway to all liquidity on the Radix network. Initially known as DSOR (http://DSOR.io) (Decentralised Smart Order Router), it has been revamped and rebranded under the CaviarNine umbrella. The Aggregator's primary function is to ensure traders get the best possible price by aggregating liquidity from various sources. It searches all known DeFi smart contracts for liquidity, ensuring users get the best trade across all liquidity sources on Radix. The Aggregator connects to DEXs, order books, token bridges, validators, and other liquidity sources, ensuring traders don't have to rely on a single DEX for the best price. Fee Structure A minimal fixed fee is charged by the Aggregator, which is waived when routing to any CaviarNine liquidity. All fees accrued are directed to the FLOOP Treasury, controlled by the FLOOP DAO. These fees can potentially be converted into $FLOOP and either burned or used for long-term liquidity rewards. Index Pools Index Pools are multi-token pools that allow for bespoke liquidity provision (https://blog.caviarnine.com/product-preview-index-pools-e126966e3dfd) , each characterized by different constituents, weights, and risk profiles. Pools can comprise between 2 and 9 distinct tokens. Fee Structure Index Pools incorporate a minor fixed fee component along with a dynamic liquidity fee meaning that in volatile market scenarios, liquidity providers receive increased fees as a buffer against impermanent loss. Liquidity providers (LPs) receive a portion of the liquidity fee proceeds. As per FLOOP Tokenomics part 1, the fixed fees are directed to the FLOOP Treasury, which is overseen by the FLOOP DAO. These fees might be converted into FLOOP and either burned or allocated for extended liquidity rewards. 
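The Aggregator's best-price routing described above can be sketched as follows. The liquidity sources, reserves, and fees here are invented for illustration; the real Aggregator queries live on-ledger sources on Radix:

```python
# Minimal sketch of best-price aggregation: ask every liquidity source for a
# quote on the same trade and route to whichever returns the most output.
# Source names, reserves, and fees below are hypothetical.

def constant_product_quote(reserve_in, reserve_out, amount_in, fee=0.003):
    """x*y=k AMM output for a given input, after the pool's swap fee."""
    amount_in_after_fee = amount_in * (1 - fee)
    return reserve_out * amount_in_after_fee / (reserve_in + amount_in_after_fee)

# Hypothetical liquidity sources for the same token pair.
sources = {
    "dex_a": lambda x: constant_product_quote(100_000, 50_000, x),
    "dex_b": lambda x: constant_product_quote(250_000, 120_000, x, fee=0.001),
    "order_book": lambda x: x * 0.478,  # best resting ask, flat price
}

def best_route(amount_in):
    # Quote every source and pick the one with the highest output.
    quotes = {name: quote(amount_in) for name, quote in sources.items()}
    best = max(quotes, key=quotes.get)
    return best, quotes[best]

name, out = best_route(1_000)
print(name, round(out, 2))   # dex_a 493.58
```

A production router can do better still by splitting one order across several sources, since AMM quotes worsen with trade size; the single-winner rule above is the simplest useful baseline.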
Order Book The Order Book is a low-fee market and limit order system that stands as an alternative to CaviarNine’s standard Automated Market Makers (AMMs). The Order Book also provides options for immediate execution or cancellation on partial fills. While order books allow traders to specify exact prices for transactions, they don't guarantee order fulfillment. AMMs, on the other hand, guarantee trade execution but can vary in price based on pool liquidity and trade size. Fee Structure A nominal fee is charged by the Order Book and directed to the FLOOP Treasury, which is overseen by the FLOOP DAO. Shape Liquidity Shape Liquidity (https://docs.caviarnine.com/products-floop/shape-liquidity/overview) is an Automated Market Maker (AMM) that allows liquidity providers (LPs) to focus their positions on active trading zones within specific price ranges, using a variety of concentrations. This is more capital efficient and enables LPs to generate higher fees from their available capital. Shape Liquidity provides various shapes for liquidity concentration, such as: - Constant Height Range: Maintains a steady concentration of liquidity over the desired price range. - Center-Peaked Curve: Liquidity peaks at the center of the range and tapers off towards the edges. - Bar-Bell Shape: Ideal for providers who anticipate significant trading activity around two specific regions. Fee Structure Shape Liquidity features both a fixed fee and a dynamic liquidity fee that auto-compounds to the underlying position, acting as a cushion against impermanent loss. Liquidity providers share the proceeds of the liquidity fee, while fixed fees are sent to the FLOOP Treasury. LSU Pool The LSU Pool (https://docs.caviarnine.com/products-floop/lsu-pool/lsu-pool-overview) is a multi-token pool for Radix’s Liquid Staking Units (LSUs) that represent shares of the staking pool. Each LSU is associated with specific validators, who may offer bespoke rewards for staking with them. 
The LSU Pool offers multiple ways for users to utilize their LSU tokens, such as moving between validators, facilitating Instant Unstaking, and providing additional yield to LSU Pool liquidity providers. Instant XRD Unstaking (https://docs.caviarnine.com/instant-xrd-unstaking/overview) allows users to send an LSU and receive XRD instantly, albeit at a slightly reduced amount compared to waiting the standard 7-day unstaking period. This feature provides flexibility and immediate liquidity to users. Fee Structure In line with FLOOP Tokenomics part 1, fixed fees are directed to the FLOOP Treasury, controlled by the FLOOP DAO. Perpetual $XRD Perpetual XRD (pXRD) is a market-driven liquid staking derivative on Radix. Initially set at a 1:1 ratio with XRD, pXRD's future price is determined by an on-chain oracle that takes into account the yields of active validators. This ensures that pXRD's growth rate is in line with the performance of top-tier validators. Users can deposit eligible collateral into a vault, and pXRD is automatically minted against it. They can receive up to 95% of the value of their collateral as pXRD. The system offers high leverage opportunities due to the low volatility of pXRD and its eligible collateral. If the eligible collateral drops below a certain threshold (e.g., 102% of the pXRD value), liquidation occurs. Anyone can claim the collateral by supplying pXRD, earning the difference. The main types of collateral include XRD (Radix's native token), LSUs (any validator token), LSU-NFTs (claim token receipts obtained after unstaking from a validator), and LSU-LP (Liquidity Pool tokens from CaviarNine's LSU Pool). The consistent increase in Perpetual XRD's price offers a systematic, yield-based valuation process, eliminating sudden price-driven liquidations. Users can earn above-average yields, unlock the potential of their XRD, and benefit from a staked XRD derivative designed for DeFi protocols. 
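The collateral rules above (a 95% mint limit and the quoted example of a 102% liquidation threshold) imply the following arithmetic. This is a worked sketch with made-up figures, not the protocol's actual vault logic:

```python
# Worked example of the pXRD collateralization rules described above:
# pXRD can be minted up to 95% of collateral value, and liquidation opens
# once collateral falls below 102% of the outstanding pXRD value.

MAX_LTV_PCT = 95         # max pXRD mintable, as % of collateral value
LIQ_THRESHOLD_PCT = 102  # liquidatable when collateral < 102% of pXRD value

def max_mint(collateral_value):
    return collateral_value * MAX_LTV_PCT // 100

def is_liquidatable(collateral_value, pxrd_value):
    return collateral_value * 100 < pxrd_value * LIQ_THRESHOLD_PCT

# Deposit collateral worth 1,000 XRD and mint the maximum pXRD against it.
collateral = 1_000
debt = max_mint(collateral)                      # 950 pXRD
print(is_liquidatable(collateral, debt))         # False: 1000 >= 969 (950 * 1.02)

# pXRD's oracle price ratchets upward with validator yields, so the debt's
# XRD value grows over time; once collateral covers less than 102% of it,
# anyone may supply pXRD, claim the collateral, and keep the gap.
debt_after_growth = debt * 104 // 100            # debt value after ~4% pXRD growth
print(is_liquidatable(collateral, debt_after_growth))  # True: 1000 < 988 * 1.02
```

Because pXRD appreciates gradually with staking yield rather than jumping with the market, positions drift toward the threshold slowly, which is what the text means by "eliminating sudden price-driven liquidations."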
The system also offers arbitrage opportunities and charges a nominal protocol fee. Perpetual XRD is perfect for those seeking stable growth, risk managers, and yield hunters. Fee Structure Similar to FLOOP Tokenomics part 1, the protocol fees are directed to the CAVIAR Treasury, which is overseen by the CAVIAR DAO. Radit Radit.io (http://Radit.io) was an on-chain messaging service but has been discontinued. Team The CaviarNine team (https://docs.caviarnine.com/introduction/team) is diverse and spread across Thailand, Singapore, Canada, and Australia, with its headquarters in Bangkok. The founders, Oliver Scott Simons (https://www.notion.so/Oliver-Scott-Simons-9d7ee5e8d6cd4a81a6e9aeac2e0fa003?pvs=21) (aka Tronn) and Chris Colman, have extensive backgrounds in finance and technology. Their mission is to provide users with seamless access to professional-grade innovative DeFi products and unlock the full potential of DeFi. Tokens CaviarNine has introduced two primary tokens: $FLOOP and $CAVIAR (https://docs.caviarnine.com/tokens/overview) . These tokens are associated with the platform's DeFi products and serve as utility and DAO tokens. They were minted prior to the Babylon upgrade and the start of dApps on Radix. Holders of these legacy tokens can bridge them to the Babylon (https://blog.caviarnine.com/floop-tokenomics-part-1-72e2a4c1fc62) versions that power CaviarNine's products. Floop ($FLOOP) Babylon $FLOOP tokens (https://blog.caviarnine.com/floop-tokenomics-part-2-effbfc3fb3bd) will power the expanding ecosystem and serve as governance tokens for the nascent FLOOP DAO. Users will have the option to transition their existing Olympia $FLOOP into Babylon FLOOP via a one-way bridge. Distribution Around 55% of the original $FLOOP tokens were distributed via airdrops. 
Following the Babylon upgrade, the distribution of the original $FLOOP tokens is anticipated as follows: - Core Team: 200 FLOOP (20%) - Company Reserve: 200 FLOOP (20%) - Airdrops to LPs active within the FLOOP ecosystem (5%) Caviar ($CAVIAR) Babylon $CAVIAR, akin to $FLOOP, will be a newly bridged token that will power future DeFi structured products that CaviarNine intends to develop. FLOOP DAO The FLOOP Decentralized Autonomous Organization (DAO) is the cornerstone of the vision for a decentralized and community-driven ecosystem. Babylon $FLOOP tokens will be the backbone of the DAO, enabling decentralized decision-making that shapes the ecosystem's direction. ## History of Radix URL: https://radix.wiki/contents/history/history-of-radix Updated: 2026-02-06 Radix has been developed through six major iterations since its inception in 2013, beginning with a blockchain model and culminating with a fixed shard space and Cerberus (https://www.notion.so/Cerberus-Consensus-Protocol-d5755bce114d42e781128ee6a5603dcd?pvs=21) . Timeline
- 1979-07: Daniel Patrick Hughes (https://www.notion.so/Dan-Hughes-ce73c8c28e8446b5b4e8d997f8be3d98?pvs=21) was born in Stoke-on-Trent, England.
- 1984: Hughes’ father bought him a Spectrum ZX81 computer (https://www.forbes.com/sites/parmyolson/2019/01/09/this-hermitic-engineer-is-plotting-the-death-of-blockchain/) , which sparked his interest in coding.
- 2008-02-25: Hughes incorporated KDB Technology (https://find-and-update.company-information.service.gov.uk/company/06512975) , a mobile technology company.
- 2011: Having just sold his third software company (https://web.archive.org/web/20171211210150/https://www.radix.global/about) , Hughes discovered Bitcoin and began mining, creating test networks, and experimenting with RPC tools. He had previously developed software for NFC-based mobile payments.
- 2012: Hughes set up a private global Bitcoin network (https://web.archive.org/web/20171211210150/https://www.radix.global/about) with servers in Australia, Europe, and the USA. With block size limits removed, the network failed at 400–500 tps and ceased functioning at 700 tps.
- 2013: Hughes forked the Bitcoin code to achieve a maximum of 1000 transactions per second (tps).
- 2013-2016: Hughes developed eMunie (https://www.notion.so/eMunie-6ce15e33de9e4708b2da4a6e20b6ddc7?pvs=21) .
- 2016-12-02: Hughes and Piers Ridyard incorporated Surematics (https://find-and-update.company-information.service.gov.uk/company/10508397) , decentralised deal-room software for insurance companies.
- 2017-07-13: RDX Works (https://find-and-update.company-information.service.gov.uk/company/10864928) incorporated.
- 2019-06-11: Tempo (https://www.notion.so/Tempo-Consensus-Mechanism-cc66e55fbb6f46a990ff7b9088318f39?pvs=21) achieved over 1.4m tps on 1187 nodes (https://www.radixdlt.com/blog/replaying-bitcoin) .
- 2019-07-16: Radix Foundation (https://find-and-update.company-information.service.gov.uk/company/12106715) incorporated.
- 2021-07-28: Radix ‘Olympia’ (https://www.notion.so/8fa2123a247c4e60853253ca2ab3e40e?pvs=21) Mainnet launched.
- 2021-12-15: Radix ‘Alexandria’ (https://www.notion.so/ab78b3f478594b4d8905360c2fade517?pvs=21) developer environment launched.
- 2023-09-28: Radix ‘Babylon’ (https://www.notion.so/2cdc168d56e04787a9f412c23560c4b0?pvs=21) Mainnet launched.
Development History From 2013, Dan Hughes (https://www.notion.so/Dan-Hughes-ce73c8c28e8446b5b4e8d997f8be3d98?pvs=21) began to experiment with various data structures and consensus mechanisms. Each iteration led to insights that have informed the current Radix architecture. Pre-Radix: Bitcoin Stress Testing (2012) Before beginning work on Radix, Hughes set up a private global Bitcoin network with servers in Australia, Europe, and the USA. He removed all block size limits and flooded the network with random transactions. 
At 400–500 transactions per second the network began to fail, and at 700 tps it ceased functioning entirely (https://web.archive.org/web/20171211210150/https://www.radix.global/about) . Hughes attributed this natural limit to the CAP theorem (https://en.wikipedia.org/wiki/CAP_theorem) and the finite speed of information transfer across a global network. Radix Iteration 1: Blockchain (700-1000 tps) In 2013, Hughes forked the Bitcoin codebase and experimented with different permutations of block size, block times, and hardware. These experiments established that the practical limit of blockchain throughput is approximately 700-1000 tps. A simple comparison with Visa at 24,000 tps meant that blockchains would never work as a global payments rail. Further Reading - Why Blockchains Don’t Scale (https://www.radixdlt.com/post/blockchains-are-broken-here-is-why) eMunie Main Article: eMunie (https://www.notion.so/eMunie-6ce15e33de9e4708b2da4a6e20b6ddc7?pvs=21) eMunie was Hughes’ attempt to create a blockchain platform that would crack the trilemma of security, scalability, and speed. Although it consistently achieved 400 transactions per second and contained built-in mailing systems, chat rooms, and a proprietary marketplace, Hughes realized it still fell short. Radix Iteration 2: Blocktree (200 tps) The limits of monolithic blockchains led to Hughes’ first experiments with sharding (https://www.twitch.tv/fuserleer/clip/SpicyGloriousDillPipeHype-Fm0tjQDsKoXqRibF) in the form of a ‘blocktree’: a data structure of multiple, branching blockchains. At this stage, every branch had its own trust boundary (https://www.notion.so/Trust-Boundary-6e402121f2124625b781f8ac3751595c?pvs=21) , necessitating a complex messaging system to communicate between them. This led to the insight of grouping related transactions together into the same branch. Unfortunately, at around 200 tps, any disagreements between branches led to an exponential rise in message complexity as the states reorganized. 
Radix Iteration 3: Directed Acyclic Graph (DAG) (1500 tps) Following on from projects like IOTA (https://www.iota.org) , Hughes started exploring Directed Acyclic Graphs (DAGs), which allow for parallel, asynchronous processing of transactions. This was also a move away from Proof-of-Work (PoW) Sybil protection. Ultimately, the lack of coordination between nodes made the DAG vulnerable to double-spend attacks. Further Reading - Why DAGs Don’t Scale Without Centralization (https://www.radixdlt.com/post/dags-dont-scale-without-centralization) Radix Iteration 4: Channeled Asynchronous State Tree (CAST) (2300 tps) In an effort to contain the message complexity of blocktrees, a CAST separates ledger state from data availability by hosting the former on a blocktree and the latter in a DAG. The way that CAST uses lightweight Merkle representations of node votes has been retained (https://www.twitch.tv/fuserleer/clip/EphemeralVibrantBulgogiFrankerZ-IwQ75fU6T0uIdhAs) but, though an improvement on previous attempts, any network latency still led to an exponential rise in message complexity. Radix Iteration 5: Tempo (1.4m tps) Main article: Tempo (https://www.notion.so/Tempo-Consensus-Mechanism-cc66e55fbb6f46a990ff7b9088318f39?pvs=21) Tempo was the fifth iteration of code since 2013 and led to Hughes being accepted into Y Combinator in 2017. Around this time the project changed its name from eMunie to Radix. The key insight behind Tempo was the Logical Clock (https://web.archive.org/web/20171211210150/https://www.radix.global/about) : by giving every node a Logical Clock, a tamper-proof record of the relative ordering of all network events could be maintained as a continuous, asynchronous stream without periodic state updates. 
Further Reading - https://www.radixdlt.com/post/tempo-consensus-lessons-learned (https://www.radixdlt.com/post/tempo-consensus-lessons-learned) Radix Iteration 6: Cerberus (10m tps) Main article: Cerberus (https://www.notion.so/Cerberus-Consensus-Protocol-d5755bce114d42e781128ee6a5603dcd?pvs=21) Cerberus uses Tempo’s pre-defined shardspace concept but also builds on a number of well-proven cryptographic primitives. These combine to create a unique BFT-style agreement process that enables scalability alongside security. Importantly for the Weak Atom problem, its security bounds can be rigorously proven, giving strong guarantees around safety, liveness, and finality. Cerberus has been peer reviewed (https://escholarship.org/uc/item/6h427354) up to 10m tps on 1024 shards. Early Team (2017) By the end of 2017, the Radix team (https://web.archive.org/web/20171211210150/https://www.radix.global/about) comprised six members: - Dan Hughes — Chief Technology Officer. Prior to discovering Bitcoin in 2011, Hughes helped develop software for NFC-based mobile payments and had built, run, and exited three successful software startups. - Piers Ridyard — Chief Executive Officer. Began experimenting with insurance smart contracts on blockchain in early 2015. Co-founded Surematics (https://find-and-update.company-information.service.gov.uk/company/10508397) (YCombinator S’17), which built the world’s first decentralised dataroom on the Radix ledger. - Robert Olsen — Chief Operating Officer. A crypto investor and blockchain evangelist since 2012, handling operations, marketing, PR, and community communications. - Stephen Thornton — Research Scientist. Following a career in scientific research at Cambridge University, Thornton researched applications of Graph Theory and Special Relativity to the Radix network. - Marc Rubio — Developer. 
An Android developer with a master’s in electronic engineering, involved in the crypto community since 2013. - Zalán Blénessy — Developer Operations. Previously helped ST-Ericsson create mobile operating systems and contracted for Apple on hyper-scale deployment. ## Z3US URL: https://radix.wiki/ecosystem/z3us Updated: 2026-02-06 Summary: Z3US is an open-source, community-centric web3 wallet designed for the Radix DLT (Distributed Ledger Technology) network. The wallet is user experience (UX) driven. Z3US is an open-source, community-centric web3 wallet designed for the Radix DLT (Distributed Ledger Technology) network (https://z3us.com/) . The wallet is user experience (UX) driven, focusing on providing a streamlined interface for users to interact with decentralized finance (DeFi) applications and NFTs (Non-Fungible Tokens). Functionality Z3US provides functionalities such as managing accounts, sending and receiving tokens, staking tokens to receive rewards, and connecting to DApps (Decentralized Applications) directly from the browser wallet (https://z3us.com/) . It also prides itself on offering state-of-the-art security features, including support for hardware wallets like Ledger. History The project was founded in 2021 and launched its Beta version in June of the same year. By August 2021, Z3US had integrated with decentralized exchanges (DEX). Roadmap The roadmap (https://z3us.com/roadmap) for Z3US includes plans for Wallet Babylon upgrades in Q1/Q2 2023, followed by the release of a Z3US iOS app in Q3/Q4 2023. Radix Z3US operates on the Radix DLT network, a decentralized, secure, scalable, and fast platform. 
## XSEED URL: https://radix.wiki/ecosystem/xseed Updated: 2026-02-06 Summary: The Xseed native token is primarily associated with the Radix Public Network, a decentralized network that aims to provide a scalable and secure infrastructure for various applications and ecosystems. The Xseed native token is primarily associated with the Radix Public Network, a decentralized network that aims to provide a scalable and secure infrastructure for various applications and ecosystems. Xseed tokens hold utility within the Radix ecosystem and the MetaXSeed Games platform. Overview The XSEED native token, also known as $XSEED, is a digital token associated with the Radix Public Network (https://www.radixdltstaking.com/xseed-token/) . It was established in 2021 and serves various purposes within the Radix ecosystem. Holding $XSEED long-term allows users to earn staking rewards in the form of Radix tokens ($XRD), contributing to the network's security and decentralization. $XSEED has also been utilized as an ecosystem currency, streamlining transactions and fostering a seamless, decentralized economy. Additionally, the $XSEED token has been distributed to delegators of the Radix network, providing them with an opportunity to participate in the long-term economic success of the network. Mission The mission of $XSEED, within the context of the Radix ecosystem, is to promote the " RadixDLT Staking (https://www.radixdltstaking.com/xseed-token/) " Validator Node and other projects in the Radix ecosystem. By holding XSEED tokens, participants can contribute to the long-term economic success of the node and the overall network. The XSEED token distribution among delegators allows them to share in the rewards and benefits of participating in the Radix network. Additionally, the XSEED token serves as a medium of exchange within a gaming NFT marketplace, facilitating transactions within the marketplace. 
Token Economics All 10 million $XSEED tokens (https://www.radixdltstaking.com/xseed-token/) will be transferred to a trust and distributed in accordance with the following timetable: 5,000,000 (50%) Staking Reward: For each new stake delegated to our RadixDLT Staking Validator Node, you will receive XSE (1:20). 600,000 (6%) Long-term Staking Rewards - To further encourage long-term staking on our Node, XSE tokens will be regularly given out to our long-term delegators. 700,000 (7%) Used to pay for marketing efforts. 500,000 (5%) Team and Development - Until the Xi'an Release is put into effect (2024), these tokens will be frozen. 3,200,000 (32%) Future Projects Reserved - Although the purpose of these tokens is not yet clear, they will be used to increase the value of the XSE token. Benefits Joining the Xseed native token ecosystem can offer several benefits (https://www.radixdltstaking.com/xseed-token/) : Staking Rewards By holding and staking $XSEED, you can earn staking rewards in the form of Radix tokens ($XRD). The more Xseed tokens you stake, the higher your potential rewards. This provides an opportunity for passive income generation within the Radix ecosystem. Network Participation By joining the Xseed native token ecosystem, you become an active participant in the Radix Public Network. This allows you to contribute to the network's security and decentralization by staking your $XSEED. Active participation helps strengthen the network and its capabilities. Ecosystem Currency $XSEED serves as an ecosystem currency within the MetaXSeed Games platform. By joining, you can utilize $XSEED for in-game purchases, transactions, and interactions within the decentralized economy of the platform. This fosters a more seamless and efficient gaming experience. Utility within the Platform Being a part of the $XSEED ecosystem may provide additional utility within the MetaXSeed Games platform.
This could include exclusive in-game benefits, community governance rights, or participation in platform development decisions. These added benefits enhance the overall user experience and engagement. Token Utility $XSEED has a variety of utilities and use cases within the Radix ecosystem and the MetaXSeed Games platform. Here are some of the token utilities associated with $XSEED: Staking $XSEED can be staked on the Radix Public Network, allowing token holders to contribute to the network's security and decentralization. By staking $XSEED, users can earn staking rewards in the form of Radix tokens ($XRD). Rewards Holding $XSEED long-term enables users to earn staking rewards. The more $XSEED a user holds and stakes, the higher their potential rewards. Ecosystem Currency $XSEED serves as an ecosystem currency within the MetaXSeed Games platform. It streamlines transactions within the decentralized economy of the platform. This allows for seamless and efficient in-game purchases and interactions. Additional Value Propositions $XSEED holders may get access to other value propositions within the MetaXSeed Games platform. These could include exclusive in-game benefits, community governance rights, or participation in platform development decisions. Staking Process To stake (https://www.radixdltstaking.com/xseed-token/) $XSEED on the Radix Public Network, here's a general process to follow: Obtain $XSEED You must first obtain $XSEED through a reputable exchange that supports its trading. Alternatively, you can receive Xseed tokens as a reward for participating in the Radix ecosystem or by delegating to a node that rewards $XSEED to its delegators. Connect Wallet Next, you must connect to a compatible wallet that supports $XSEED staking. The Radix wallets currently supporting $XSEED staking are the Radix Desktop Wallet and Ledger Nano S hardware wallet. Stake $XSEED tokens Once you have connected your wallet, you can delegate $XSEED to a validator node.
This can be done through a staking dashboard in your wallet or through a validator's dedicated staking portal. Wait for Rewards Upon successful staking, you will begin to earn staking rewards in Radix tokens ($XRD). The rewards are proportional to the number of $XSEED you stake and can be claimed at regular intervals. ## XRDegen URL: https://radix.wiki/ecosystem/xrdegen Updated: 2026-02-06 Summary: XRDegen is an NFT launchpad and marketplace built on Radix. The platform provides a comprehensive suite of tools for creators to launch, manage, and monetize th XRDegen is an  NFT launchpad and marketplace (https://xrdegen.gitbook.io/xrdegen)  built on Radix. The platform provides a comprehensive suite of tools for creators to launch, manage, and monetize their non-fungible token (NFT) collections while offering traders a secure environment for buying and selling digital assets. https://youtu.be/FhSy_RZdyyY (https://youtu.be/FhSy_RZdyyY) Developed in the United States, XRDegen offers a user-focused approach to NFT trading with features designed to benefit both creators and collectors. The platform's core infrastructure includes an  NFT marketplace, Web3 portfolio tools, and a launchpad (https://xrdegen.com/)  for new collections. The platform distinguishes itself through several key features, including a  randomized minting process (https://xrdegen.gitbook.io/xrdegen/artwork-reveal)  that ensures fairness during initial drops, a  creator-friendly royalty system (https://xrdegen.gitbook.io/xrdegen/royalties-for-creators)  that provides ongoing revenue from secondary sales, and a  transparent fee structure (https://xrdegen.gitbook.io/xrdegen/marketplace-fees)  with competitive rates compared to other major NFT marketplaces. XRDegen aims to serve as a central hub for Radix-based NFT activity, enabling creators to easily launch new collections through CSV file uploads and allowing collectors to discover, mint, and trade NFTs through an intuitive interface. 
The platform also supports claiming pre-existing collections (https://xrdegen.gitbook.io/xrdegen/claiming-pre-existing-collections) that were launched before XRDegen's creation, allowing those creators to benefit from the platform's features and royalty system. Platform Features XRDegen provides a comprehensive ecosystem for NFT creation and trading on the Radix blockchain, with three core components: the marketplace, launchpad, and creator dashboard. NFT Marketplace The XRDegen marketplace (https://xrdegen.com/) enables users to mint, buy, sell, and make offers on NFTs from any collection on the Radix blockchain. The platform maintains competitive pricing with a 2% fee on every trade (https://xrdegen.gitbook.io/xrdegen/marketplace-fees) , which is comparable to Magic Eden (2%) and lower than OpenSea (2.5%). This marketplace serves as the central trading hub where collectors can discover NFTs and creators can benefit from ongoing royalties from secondary sales. Launchpad The XRDegen launchpad (https://xrdegen.gitbook.io/xrdegen/launch-your-collection/why-launch-with-xrdegen) provides creators with tools to launch and manage NFT collections. The launchpad facilitates collection creation through CSV file uploads (https://xrdegen.gitbook.io/xrdegen/launch-your-collection/launch-your-collection/format-your-csv-file) , allowing creators to efficiently organize asset data including names, descriptions, image URLs, and metadata traits. Once launched, collections appear in the DROPS section for easy discovery by potential buyers. To ensure fairness during minting, XRDegen employs a unique artwork reveal system (https://xrdegen.gitbook.io/xrdegen/artwork-reveal) that only displays the NFT's metadata and artwork after a transaction is submitted. This prevents users from repeatedly canceling and resubmitting transactions until they land a rare NFT, maintaining the element of surprise that is essential to fair NFT distribution.
The platform charges an  initial collection setup fee of 2000 XRD and minting fees of 10% of the mint price (https://xrdegen.gitbook.io/xrdegen/launchpad-fees)  (or 25 XRD, whichever is greater) per NFT minted. For collections priced below 25 XRD, creators must pay minting costs upfront but retain 100% of funds raised from sales. Creator Dashboard The  Creator Dashboard (https://xrdegen.gitbook.io/xrdegen/launch-your-collection/the-dashboard)  serves as the control center for managing NFT collections. Through this interface, creators can: - Submit new collections with comprehensive details including visuals, descriptions, categories, and features - Edit existing collections to keep information current - Access analytics data on collection performance, including views, interactions, and transaction volumes - Manage their creator profile and connect social media accounts Each collection receives a  dedicated landing page (https://xrdegen.gitbook.io/xrdegen/your-collection-landing-page)  with a custom URI (XRDEGEN.com/collections/[custom-name]) that displays collection details, social links, and performance metrics such as floor price and all-time volume. Notably, XRDegen also supports  claiming pre-existing collections (https://xrdegen.gitbook.io/xrdegen/claiming-pre-existing-collections)  launched before the platform existed. Creators can verify ownership by connecting a wallet containing the project's Owner's Badge, then customize their collection's presence on XRDegen and begin earning royalties from secondary sales. Technical Details Minting Process The  minting process on XRDegen (https://xrdegen.gitbook.io/xrdegen/the-minting-process)  is designed to be straightforward for both creators and collectors. When a collection is launched, it appears in the DROPS section of the platform, making it discoverable to potential buyers. 
Users can mint directly from a collection's landing page by accessing the custom URI (XRDEGEN.com/collections/[custom-name]) or by searching for the policy ID. The platform supports efficient batch minting, allowing users to purchase up to 25 assets in a single transaction. Once a transaction is signed, NFTs are instantly minted on the Radix blockchain and transferred to the buyer's wallet. Artwork Reveal System XRDegen has implemented a  specialized artwork reveal system (https://xrdegen.gitbook.io/xrdegen/artwork-reveal)  to address a unique challenge with Radix's Transaction Manifest. While the Transaction Manifest provides transparency by letting users see exactly what they're getting in a transaction, this creates an issue for NFT minting where users might cancel and resubmit transactions until they secure the rarest piece in a collection. To maintain fairness, XRDegen only reveals artwork and metadata after a transaction is submitted. The reveal process completes within minutes of minting, ensuring that the minting experience remains unpredictable and equitable for all participants. CSV File Format Collections are launched through a  CSV (Comma-Separated Values) file (https://xrdegen.gitbook.io/xrdegen/launch-your-collection/launch-your-collection/format-your-csv-file)  that contains all necessary data for the NFTs. Creators can use spreadsheet applications like Google Sheets or Excel to prepare their data, with column headers exactly matching XRDegen's required format. The CSV must include essential information such as Asset Name, Description, and Image URL. Creators can add custom traits by creating additional columns with the prefix "metadata." (e.g., "metadata.background"). Capitalization is critical in the headers, and spaces should be avoided to prevent errors during the upload process. 
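The launchpad fee schedule and CSV layout described above (and the royalty rule covered in the Royalty Implementation section that follows) can be sketched in a few lines. This is an illustrative sketch only, not XRDegen's actual code: the function names and the exact header spellings (e.g. `AssetName`, `ImageURL`) are assumptions, chosen without spaces because the documentation lists the required fields as Asset Name, Description, and Image URL while warning that spaces in headers should be avoided.

```python
import csv
import io

# Illustrative constants taken from XRDegen's published fee and royalty rules.
SETUP_FEE_XRD = 2000       # one-time collection setup fee
MIN_MINT_FEE_XRD = 25      # per-NFT minimum minting fee
MINT_FEE_RATE = 0.10       # 10% of the mint price
MAX_ROYALTY_RATE = 0.10    # creator royalties are capped at 10%

def mint_fee(mint_price_xrd: float) -> float:
    """Per-NFT minting fee: 10% of the mint price or 25 XRD, whichever is greater."""
    return max(MINT_FEE_RATE * mint_price_xrd, MIN_MINT_FEE_XRD)

def launch_cost(mint_price_xrd: float, n_minted: int) -> float:
    """Setup fee plus total minting fees for n NFTs minted at a given price."""
    return SETUP_FEE_XRD + n_minted * mint_fee(mint_price_xrd)

def royalty(sale_price_xrd: float, rate: float) -> float:
    """Creator's cut of a secondary sale; the rate may not exceed 10%."""
    if not 0 <= rate <= MAX_ROYALTY_RATE:
        raise ValueError("royalty rate must be between 0 and 10%")
    return rate * sale_price_xrd

# Hypothetical header spellings: the required fields plus one custom trait
# column using the documented "metadata." prefix.
HEADERS = ["AssetName", "Description", "ImageURL", "metadata.background"]

def build_csv(rows: list) -> str:
    """Render collection rows (list of dicts keyed by HEADERS) as CSV text."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=HEADERS)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

Under these rules, a 100 XRD mint carries a 25 XRD fee (10% would be only 10 XRD, below the floor), so a 100-NFT drop at that price would cost the stated 2000 XRD setup plus 2500 XRD in minting fees; a 1000 XRD secondary sale at a 5% royalty pays the creator 50 XRD.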
Royalty Implementation The  royalty system (https://xrdegen.gitbook.io/xrdegen/royalties-for-creators)  enables creators to earn ongoing revenue from secondary sales of their NFTs. When setting up a collection, creators specify their desired royalty percentage (up to a maximum of 10%) and designate a wallet address to receive payments. For example, with a 5% royalty rate set on a collection, if an NFT sells for 1000 XRD on the marketplace, the creator automatically receives 50 XRD from that transaction. Notably, XRDegen allows all Radix NFT creators to set up royalties for their collections, regardless of whether they originally launched through the platform or independently. The royalty percentage can be adjusted over time through the Creator Dashboard, providing flexibility as market conditions change or as collections mature. ## Xidar URL: https://radix.wiki/ecosystem/xidar Updated: 2026-02-06 Summary: XIDAR is a service and tool provider specializing in Web3 and DeFi (Decentralized Finance). It offers a range of user-friendly, robust solutions that aim to str XIDAR is a service and tool provider specializing in Web3 and DeFi (Decentralized Finance). It offers a range of user-friendly, robust solutions that aim to streamline the Web3 and DeFi space, enabling users to easily navigate and explore the future digital landscape regardless of their technical knowledge. XIDAR believes that technical expertise and comprehension of Web3 and cryptocurrencies should not be a barrier for users to take advantage of the ongoing technological and financial paradigm shift. Platform XIDAR's platform is built on Radix, a decentralized ledger technology (DLT) network. Radix was chosen by XIDAR due to its unique, comprehensive, integrated bottom-up technological approach. 
XIDAR views Radix as the only platform that can fully support its vision of Web3 DeFi, offering the ability to fully leverage a decentralized network that enables fast and secure development without sacrificing scalability or composability. Products XIDAR Wallet The XIDAR Wallet is an industry-leading browser and soon-to-be mobile wallet for the Radix network. It allows users to manage, send, receive, stake, and swap all tokens and NFTs (Non-Fungible Tokens) from the Radix DLT network. It also features the ability to create or import multiple accounts and addresses, manage balances, search transaction history, stake XRD tokens, swap tokens in-app, add contacts, create native Radix tokens, automatically recover a wallet, send tokens to multiple addresses in one transaction, and buy XRD directly with a credit card. $IDA Token The $IDA token is a utility token within the XIDAR ecosystem that lives on the Radix DLT network. It has a fixed supply of 240 million tokens. The $IDA tokens will represent a vital utility asset within the XIDAR ecosystem and will continue to play a pivotal role in all its products and services. In particular, $IDA tokens will empower users with decision-making capabilities in XIDAR's upcoming investment DAO (Decentralized Autonomous Organization) and provide access to premium features within the XIDAR wallet. XIDAR plans to announce further benefits related to its upcoming no-code dApp creator. Tokenomics The tokenomics for the $IDA token are as follows: - Ecosystem/Community Rewards: 96,000,000 $IDA (40%) - Liquidity & Exchanges: 16,800,000 $IDA (7%) - Private Sales: 15,600,000 $IDA (6.5%) - Tech & Dev: 36,000,000 $IDA (15%) - Team & Advisors: 32,400,000 $IDA (13.5%) - Marketing: 28,800,000 $IDA (12%) - Treasury: 14,400,000 $IDA (6%) The vesting period for founders is 12 months with no cliff. ## WhyNot URL: https://radix.wiki/ecosystem/whynot Updated: 2026-02-06 Summary: WhyNot some Introduction? WhyNot some Introduction?
It all started with the question, $WHY don't you create your own meme coin? Followed by the answer, actually #WhyNot? And so, the idea for a meme coin driven by the community on the Radix network was born. Let's build on Radix and spread the word! All you need ➡️ https://linktr.ee/WhyNotXRD (https://linktr.ee/WhyNotXRD) ## VikingLand URL: https://radix.wiki/ecosystem/vikingland Updated: 2026-02-06 Summary: VikingLand is an NFT marketplace that operates on the Radix platform. The platform was launched in February 2022. The organization has no direct influence on th VikingLand is an NFT marketplace that operates on the Radix platform. The platform was launched in February 2022. The organization has no direct influence on the price of the NFTs sold on the platform and charges a 5% royalty fee and a 2.5% marketplace fee. Collections VikingLand hosts a variety of collections, including "Undying Vikings", which is an official collection of VikingLand. There are several other official collections as well, such as "RaEggs", "RaDragons", and "Viking Warrior". Revenue Distribution 10% of VikingLand's revenue is distributed among holders of their official collections, which includes "RaEggs", "RaDragons", and "Viking Warrior". Projects VikingLand is home to several NFT projects, some of which were among the first and have since sold out on the Radix DLT. ## UniX URL: https://radix.wiki/ecosystem/unix Updated: 2026-02-06 Summary: UniX is a privacy-focused social media platform being built on Radix. UniX is a privacy-focused social media platform being built on Radix. Introducing UniX: Bridging the Gap Between Social Connectivity and Web3 In the rapidly evolving landscape of Web3, where decentralized applications (dApps) are reshaping the way we interact with the digital world, UniX emerges as a trailblazing project that redefines social networking as we know it. 
UniX represents a bold step towards achieving a secure, user-centric, and innovative social platform that respects privacy and empowers individuals. Privacy-Centric Social Networking One of the standout features that sets UniX apart from conventional social media platforms is its unwavering commitment to user privacy. In a world where personal data is often commodified, Radix introduces 'Persona,' a groundbreaking feature that allows users to share only the information that's truly necessary, without the need for email addresses or phone numbers. Your online identity remains truly yours, safeguarding your privacy in an unprecedented way. Empowering User Control UniX isn't just a platform; it's a movement to put users back in control of their digital lives. Traditional social platforms often dictate how users interact and what they can or cannot do. UniX flips the script, placing you firmly in the driver's seat. You decide who sees your content, who you connect with, and how you share your experiences. UniX empowers you to craft your online journey precisely the way you envision it. Built on Radix: Powering Innovation UniX's strong foundation is powered by Radix, a revolutionary technology stack that unlocks unprecedented possibilities. Radix's human-readable transactions simplify user interactions, making UniX exceptionally user-friendly. The wallet connect feature ensures secure and seamless transactions, fostering trust and usability. Transparency and Trust In a world where the inner workings of platforms are often shrouded in mystery, UniX stands as a beacon of transparency. We prioritize transparency, sharing our processes and much more with our community. Trust is built on transparency, and UniX is committed to fostering trust among its users. Versatility and More UniX aims to be your all-in-one solution for your social presence. 
From blog posting to connecting with others to directly transferring assets and more, UniX provides every tool you need for a seamless and enriching online experience. We're not just a social media platform; we're a comprehensive ecosystem that empowers you to express, connect, and transact. Join Us in Shaping the Future As we embark on this transformative journey, we invite you to join us in shaping the future of social networking. UniX is more than just a dApp; it's a testament to the power of user-centric innovation. We're committed to building a secure, privacy-focused, and innovative platform where you are the master of your digital destiny. Stay tuned for exciting updates, collaborations, and the incredible experiences that await you in the UniX ecosystem. The future of social networking is here, and it's defined by you, for you. Welcome to UniX, where the future of social meets the decentralized web. Your journey begins now. ## UNISCI URL: https://radix.wiki/ecosystem/unisci Updated: 2026-02-06 Summary: Unisci is an NFT project built on the Radix Distributed Ledger Technology (DLT). We specialize in creating handmade NFTs, with each NFT representing a unique th Unisci is an NFT project built on the Radix Distributed Ledger Technology (DLT). We specialize in creating handmade NFTs, with each NFT representing a unique theme that changes with the seasons. Our focus on combining craftsmanship with the power of Radix DLT allows us to offer collectors an exclusive and diverse collection of digital artworks. Features Our primary objectives include: - Building a strong and exclusive community. - Establishing a Vault that supports the growth and development of exciting new projects. The Project Get ready to join the Unisci revolution - where we combine the unique and the scientific for a project that's truly out of this world! The first part of our name 'unique' perfectly reflects our NFT drops, each one a stunning masterpiece unlike anything you've seen before. 
But that's just the beginning! The second part of our name 'scire' represents the additional value we bring to the table with our innovative project. So what are you waiting for? Come join us on this incredible journey! Unique Step into our world of handmade wonder with our collection of 500 unique NFTs, each one released piece by piece for your viewing pleasure! We've got a grand plan of 5 seasons, each packed with 100 incredible NFTs, all with their own unique themes that'll make your imagination soar. And the best part? Every single NFT is lovingly crafted by us, with no automatic trait generators in sight. It's like stepping into a whole new universe of digital art, and we can't wait for you to be a part of it! Scire Join the unstoppable Unisci community, where we're all about sharing information, knowledge, and ideas with our holders. Wondering how we make it happen? It's simple! We pour a good amount of the income from our mint into our vault, ensuring that we have the resources we need to keep building and innovating. The Unisci Vault A large portion of the total mint will be devoted to investing in cutting-edge Radix projects, including eye-catching NFTs, as well as developing our very own project. Get ready to join a community of like-minded individuals passionate about pushing the boundaries of what's possible! Team The Unisci team consists of a project manager named Cala and an artist named Flok. Seasons Season 1: Radix Community The first theme of our collection is: Radix community! That means you can put together your own avatar, which we will draw for you! And it gets even better: our season 1 will be free to mint!!! 88 have successfully applied and received their NFT. The remaining 12 spots will be assigned by us in the course of the project. Season 2: Arctic Prepare to embark on a thrilling adventure as we unveil our second season - a journey into the stunningly beautiful, yet dangerously unforgiving, Arctic wilderness!
Our expedition team is braving the icy tundra, encountering all sorts of creatures along the way - from the adorable and cuddly to the fierce and ferocious. But be warned - danger lurks behind every snowbank, and even the most innocent-looking creature may not be what it seems! The misty veil of the Arctic hides many secrets, and only the bravest and most daring explorers will survive this epic journey. So grab your parka and join us on the trip of a lifetime! Season 3: Universe Our third season takes us on an interstellar journey into the vast and mysterious universe. Our spaceship, the Unistar, is home to a diverse crew of astronauts and adventurers. As we travel through the galaxies, we encounter strange and fascinating creatures, visit exotic planets, and discover ancient civilizations. But not all is as it seems in the infinite expanse of space. Hidden behind the glittering stars and colorful nebulas, there are dangerous forces at work. We must be vigilant and careful as we navigate the unknown depths of the universe. ## Trove URL: https://radix.wiki/ecosystem/trove Updated: 2026-02-06 Summary: Trove is a digital asset trade management tool allowing users to swap NFTs and various tokens efficiently and securely. The platform removes the need for interm Trove is a digital asset trade management tool allowing users to swap NFTs and various tokens efficiently and securely. The platform removes the need for intermediaries, providing a trustless experience with simplicity and efficiency. Runs on Radix Q&A: Trove (https://youtu.be/vf5b8llQERI?si=RwbS-yw7e6-_Zypu) Overview Building Trove on Radix offers benefits like fewer deal size constraints, clear transaction manifests, low transaction fees, and inherent scalability. The platform emphasizes secure and direct user-to-user trades and eliminates the need for intermediaries or drawn-out escrow processes.
Trove is noted for its simplicity and efficiency, being built with fewer than 300 lines of Scrypto code. Key features of Trove include: - Platform for Managing and Advertising Digital Asset Trades: Offers an effortless and secure medium for managing swaps of digital assets. - Scalable and Efficient: Crafted with fewer than 300 lines of Scrypto code, a level of compactness that would be unattainable on platforms like Ethereum or Polygon. - Reduced Deal Size Constraints, Minimal Gas Fees, Transaction Manifest, and More: Allows users to potentially transact hundreds or even thousands of assets in one go, with low transaction fees and unparalleled clarity and assurance. History Early Experiments on Olympia Mainnet The Olympia mainnet on Radix was the initial platform of choice for @ripsource's experiments. The first project, named "Rippy", was a parody of the Radix assistant, hosted on a rudimentary website designed with basic HTML, CSS, and segments of copied JavaScript. This early iteration was plagued by slow load times, with pages often taking up to 15 seconds to load. Recognition at RadFi2022 Creative Competition Transitioning from a personal hobby, @ripsource's work began to gain external acknowledgment. This recognition was highlighted by the win at the RadFi2022 Creative Competition held by @radixdlt. The victory resulted in increased attention, leading to a larger audience discovering the project. Collaborative Projects The post-competition period saw @ripsource collaborating with other teams within the decentralized application community. One notable partnership was the development of a collection website for @PenguinsXRD, available at http://collection.radicalpenguins.com (http://collection.radicalpenguins.com/) . Diving into Smart Contracts with Scrypto 101 A pivotal phase in @ripsource's journey was enrollment in Radix’s Scrypto 101 course.
The structured modules within this course enabled @ripsource to transition from having no knowledge of smart contract creation to launching an NFT-based retro arcade game on RCNet V1. Introduction of Trove Utilizing the skills and knowledge accumulated, @ripsource conceptualized Trove. The primary objective of Trove was to offer a solution to the Radix community, allowing for a secure and trustless mechanism for users to manage and swap assets. Role of Radix in Web3 Development The development and success of Trove highlighted Radix's potential in simplifying the traditionally complex aspects of dApp and smart contract creation. With its user-friendly tools and platforms, Radix aims to make #web3 more accessible to a broader range of users and potential developers. Features - Singular or Bundle Trades: Trade NFTs, a bundle of NFTs, or various tokens in consolidated deals. - Scalable and Efficient: Crafted with fewer than 300 lines of Scrypto code, far more compact than an equivalent implementation on platforms like Ethereum or Polygon. Why build on Radix? Radix's native asset handling and inherent scalability enable Trove to offer reduced deal size constraints, minimal gas fees, and transaction manifests, making trades on Trove frictionless. Benefits of Building on Radix - Reduced Deal Size Constraints: Potentially transact hundreds or thousands of assets in one go. - Minimal Gas Fees: Low transaction fees compared to networks like Ethereum. - Inherent Scalability: Managed effortlessly by the Radix Engine. Trove in Action: A User's Guide The platform’s usability can be summarized through a scenario of Alice and Bob trading NFTs: Trade Proposal Creation - Alice sets up a Trade Proposal on Trove. - Specifies the NFTs and tokens to be traded. - Trade proposal is created, and Alice receives an NFT badge with details. Trade Acceptance - Bob joins the trade using Alice's code. - Verifies and accepts the trade. Trade Completion - Alice authorizes the release of new NFTs and tokens.
- Trade is completed in a single transaction. Founder: Ripsource Ripsource is Trove's sole founder and developer. He transitioned into dApp development on Radix and designed Trove to foster Radix's unique bartering culture. Background - Finance and Renewable Energy Engineering: Ripsource's original background. - Radix Development: A significant creative outlet, growing expertise in Scrypto. - Future Prospects: Driven by opportunities in the Radix community, Ripsource hopes to commit full-time. ## Topradixnode URL: https://radix.wiki/ecosystem/topradixnode Updated: 2026-02-06 Summary: Topradixnode is a node operator that provides staking services on the Radix platform. Topradixnode is a node operator that provides staking services on the Radix platform. Overview The services they offer include staking, which allows token holders to earn rewards on their holdings by participating in the Radix network. As a trusted Radix validator node, Topradixnode offers a robust infrastructure and round-the-clock availability for the Radix network. TopRadixNode is an official partner of Impahla DAO and a validator on the Radix network. They have a stake of 100,912,811 tokens (2.81%). Mission Topradixnode's mission is to contribute to the growth and development of the Radix network by providing reliable staking services. Topradixnode plays a crucial role in decentralized networks, which are secured through the participation of validator nodes. By offering staking services, they enable token holders to delegate their tokens and contribute to the consensus and security of the Radix network. The broader mission of Radix is to become the foundation for a decentralized financial system capable of supporting the scale and complexity of modern economies.
https://youtu.be/tqWmkPp2yF0?si=NfcFdY9qiMqpSLbP (https://youtu.be/tqWmkPp2yF0?si=NfcFdY9qiMqpSLbP) Products and Services The products and services offered by Topradixnode include the following: Staking Services As a trusted Radix Validator node, Topradixnode provides secure staking services for Radix token holders. They offer a robust infrastructure, unrivaled transparency, and round-the-clock availability on the Radix network. Campaigns and Collaborations Topradixnode frequently runs exciting campaigns and collaborates with various projects within the Radix ecosystem to maximize benefits for their clients. Radix Network Support Topradixnode provides guidance to individuals interested in staking their XRD tokens. They assist with steps such as buying XRD from multiple exchanges and using the Radix Wallet. These offerings aim to provide a seamless staking experience for clients while contributing to the growth and development of the Radix ecosystem. Rewards To start earning rewards, users need to download the Radix Desktop Wallet, load it with XRD tokens, select the validator nodes they want to delegate their stake to, and delegate at least 100 $XRD tokens. The rewards come from network emissions, with 300 million XRD tokens allocated for emissions each year. Security and Trust Topradixnode operates with the highest level of transparency and security. Here are a few reasons why you can trust and rely on Topradixnode: Team Topradixnode is made up of experienced professionals with a background in financial technology, security, and DevOps. These experts are dedicated to providing high-quality, secure staking services and supporting the growth of the Radix ecosystem. Secure infrastructure Topradixnode runs on a highly secure infrastructure that is protected by enterprise-grade security measures, including multi-layered firewall protection, DDoS protection, and intrusion prevention.
Transparency Topradixnode is highly transparent about their operations, and they provide a detailed breakdown of their staking rewards on their website, allowing users to see exactly how their rewards are earned. High Availability Topradixnode provides round-the-clock availability on the Radix network. They also offer daily monitoring and maintenance to ensure the stability and uptime of their infrastructure, providing a seamless staking experience for their clients. User reviews You can find many positive reviews and feedback from users who are satisfied with Topradixnode's staking services. These reviews can help to establish the trustworthiness of Topradixnode as a provider of Radix staking services. ## The Meme Studio URL: https://radix.wiki/ecosystem/the-meme-studio Updated: 2026-02-06 Summary: The Meme Studio is a pioneering Web3 Marketing Agency that specializes in DeFi, Crypto, Metaverse, NFT, and P2E gaming sectors. The agency's mission is to suppo The Meme Studio is a pioneering Web3 Marketing Agency that specializes in DeFi, Crypto, Metaverse, NFT, and P2E gaming sectors. The agency's mission is to support a broad spectrum of blockchain enthusiasts, including crypto OGs, degens, protocol founders, NFT artists, the crypto-curious, decentralized technologists, and Web3 disruptors. They offer expertise in content-focused social media marketing strategies, public relations, influencer marketing, events management, and other creative resources for effective community building. Services The Meme Studio designs Web3 marketing campaigns to elevate token growth, foster and gamify community engagement, onboard Web2 and non-crypto users, and forge strategic partnerships across industries. The agency collaborates with players from Fintech and TradFi to VCs, gaming companies, and iconic fashion labels. 
Their approach involves deep market research, full-service content production, niche media buying, and active, data-driven community management, delivered by a dedicated team with years of experience in the crypto space. Ethos The Meme Studio's ethos revolves around ensuring clients sustain long-term brand presence and make a lasting impact beyond market cycles. The agency seeks to establish long-term partnerships in the creative, marketing, and strategic aspects of projects. They offer reasonable pricing and bill clients based on results, not time. History Established during the DeFi summer of 2020, The Meme Studio began by supporting early crypto projects with humorous and engaging content, including memes, which led to its unique name. Since then, the agency has worked with leading NFT projects, P2E blockchain games, DeFi protocols, and Metaverse companies, solidifying its reputation as a world-leading Web3 Marketing Agency. Team The Meme Studio's team comprises Web 3.0 native artists and marketing experts who have lived and breathed crypto for years, mastering the art of Web 3.0 marketing. They understand the target audience of crypto degens, NFT collectors, and early adopters of the metaverse, acknowledging that traditional digital marketing methods are no longer sufficient. Strategy The Meme Studio's marketing strategy is based on understanding, creativity, and adaptability. The team stays ahead of the ever-evolving blockchain landscape, ensuring that their strategies are aligned with current trends and technologies. Their services extend beyond traditional marketing channels, delving into niche platforms that cater to the Web3 audience, including Twitter, Discord, Telegram, Reddit, YouTube, and 4chan's /biz/ forum. The agency is known for its expertise in meme marketing and knowledge of growing crypto meme characters, such as Bogdanoff, various crypto Wojak variations, Bobo Bear, Mumu Bull, and Sminem. 
Analytics The Meme Studio emphasizes the importance of data-driven decision-making. They use advanced analytics tools to monitor the success of campaigns, allowing them to make informed adjustments to optimize performance. Future Vision The Meme Studio is dedicated to helping clients build a strong brand presence and establish lasting connections with their target audience. Their commitment to excellence, creativity, and integrity has helped them forge lasting partnerships with numerous clients in the DeFi, Crypto, Metaverse, NFT, and P2E gaming sectors. ## The Hard Money Project URL: https://radix.wiki/ecosystem/the-hard-money-project Updated: 2026-02-06 Summary: The Hard Money Project (THMP) is a research and educational initiative that focuses on analyzing monetary systems and Bitcoin's role in a potential new financia The Hard Money Project (THMP) is a research and educational initiative that focuses on analyzing monetary systems and Bitcoin's role in a potential new financial paradigm. Founded by Lluis Aragones (https://hardmoneyproject.substack.com/p/welcome-to-the-hard-money-project) , Head Economist at RDX Works, the project examines what it describes as a contemporary "Bretton Woods moment" (https://hardmoneyproject.substack.com/p/welcome-to-the-hard-money-project) in the digital era, referring to the fundamental transformation of the monetary system through digitalization. https://youtu.be/e86CN2FslKs (https://youtu.be/e86CN2FslKs) Overview The Hard Money Project positions itself as an analytical framework for understanding the evolution and future of monetary systems during a period of significant technological and financial change. 
The project argues that the current momentum toward centralizing control over money on digital, programmable unified ledgers represents a pivotal shift (https://hardmoneyproject.substack.com/p/welcome-to-the-hard-money-project) in how monetary systems operate and are controlled. A central focus of the initiative is examining the distinction between money and currency, with particular attention to Bitcoin's role as a form of "hard money." The project posits that while all currency is money, not all money can sustainably function as currency (https://hardmoneyproject.substack.com/p/are-we-talking-money-or-currency) , presenting this as crucial for understanding modern monetary systems. The project's analysis extends beyond traditional economic frameworks to incorporate historical, anthropological, and technological perspectives. It traces the evolution of monetary systems from gift economies through barter systems to modern fiat currencies (https://hardmoneyproject.substack.com/p/money-i-the-past-of-trial-and-error) , examining how each stage has influenced current monetary structures and potential future developments. A significant aspect of the project's research involves analyzing Satoshi Nakamoto's original vision for Bitcoin. The project argues that Bitcoin was designed as a foundational step toward a broader vision of "dynamic smart money" and "programmable P2P social currencies" (https://hardmoneyproject.substack.com/p/inception-satoshi-nakamoto-monetary) , rather than as a standalone currency system. The initiative combines theoretical analysis with practical implications, examining how monetary systems affect sovereignty and economic freedom. It advocates for critical thinking about monetary design and policy, particularly in the context of increasing digitalization and centralization of financial systems. 
Key Concepts Money vs Currency Analysis The Hard Money Project develops a detailed framework for distinguishing between money and currency, which forms a foundational element of its monetary analysis. The project defines money as any asset that can serve as a medium of exchange, while currency is specifically money in motion - the subset of money that actively circulates within an economy (https://hardmoneyproject.substack.com/p/are-we-talking-money-or-currency) . The analysis traces the historical evolution of monetary systems, beginning with prehistoric gift economies. In these early systems, communities operated through informal reciprocal exchanges without immediate expectations of return, fostering social bonds and trust rather than direct transactions (https://hardmoneyproject.substack.com/p/money-i-the-past-of-trial-and-error) . This foundation helps explain the social and anthropological aspects of monetary systems that the project argues remain relevant today. The project identifies several key properties that determine monetary effectiveness (https://hardmoneyproject.substack.com/p/are-we-talking-money-or-currency) : - Scarcity to prevent devaluation. - Divisibility for practical use. - Portability for easy transfer. - Fungibility for uniform value. - Durability for long-term stability. - Counterfeit resistance for security. - Immutability and censorship resistance for user control. Bitcoin Analysis The project provides a detailed examination of Bitcoin's role in monetary evolution, with particular attention to Satoshi Nakamoto's original vision. According to the project's analysis of Nakamoto's early communications, Bitcoin was designed as a "basic P2P currency" intended to serve as a foundation for future "programmable P2P social currencies" (https://hardmoneyproject.substack.com/p/inception-satoshi-nakamoto-monetary) . 
A key argument in the project's analysis is that Bitcoin's fixed supply model makes it more suitable as a store of value than as a currency (https://hardmoneyproject.substack.com/p/inception-satoshi-nakamoto-monetary) . The project points out that Bitcoin's design inherently encourages holding rather than circulation, with over 92% of holders not spending any bitcoin for more than a month as of October 2024. Monetary Theory The project develops a comprehensive analysis of monetary evolution, from early commodity-based systems through to modern fiat currencies. It examines how the gold standard emerged as a solution to the limitations of bimetallic systems, providing stability until its eventual collapse during World War I (https://hardmoneyproject.substack.com/p/money-ii-from-central-banking-to) . The analysis pays particular attention to the Bretton Woods system and its aftermath (https://hardmoneyproject.substack.com/p/money-ii-from-central-banking-to) , examining how the current global fiat system emerged from the "Nixon Shock" of 1971. The project argues that this represents the first truly global fiat experiment in history, operating without any neutral anchor or commodity backing. The project's monetary theory emphasizes the importance of local economic contexts in currency function. It argues that for a currency to maintain stable purchasing power, it must be able to adapt its supply to local economic conditions (https://hardmoneyproject.substack.com/p/are-we-talking-money-or-currency) . This leads to a critique of fixed-supply monetary systems, including both traditional gold standards and modern cryptocurrencies like Bitcoin. A central theme in the project's theoretical framework is the distinction between a currency's domestic purchasing power and its foreign exchange value. 
The analysis suggests that successful currency systems must balance both aspects while maintaining stability and predictability (https://hardmoneyproject.substack.com/p/are-we-talking-money-or-currency) . This understanding informs the project's broader critique of current monetary systems and its vision for future developments. Key Arguments The Hard Money Project presents several central arguments about monetary systems and their evolution. These arguments form the theoretical foundation of the project's analysis and inform its vision for future monetary development. Decentralization and Monetary Control The project argues that the current trend toward digital transformation of money represents a critical juncture in monetary history, comparable to the 1944 Bretton Woods conference (https://hardmoneyproject.substack.com/p/welcome-to-the-hard-money-project) . A key concern raised is the increasing centralization of monetary control through digital systems, which the project suggests could lead to unprecedented levels of financial surveillance and control. The analysis particularly focuses on the risks of centralized digital ledgers controlled by governments and central banks, where all payments would settle (https://hardmoneyproject.substack.com/p/welcome-to-the-hard-money-project) . The project posits that this development could further consolidate monetary power in ways that might compromise individual financial sovereignty. Bitcoin's Role and Limitations The project presents a nuanced analysis of Bitcoin's position in monetary evolution. Based on examination of Satoshi Nakamoto's original communications, the project argues that Bitcoin was intentionally designed as a foundation for future monetary development rather than as a complete currency system (https://hardmoneyproject.substack.com/p/inception-satoshi-nakamoto-monetary) . 
A significant argument is that Bitcoin's fixed supply model, while making it effective as a store of value, inherently limits its functionality as a currency (https://hardmoneyproject.substack.com/p/inception-satoshi-nakamoto-monetary) . The project points to the high percentage of bitcoin being held rather than circulated as evidence of this limitation, suggesting that the asset's design encourages accumulation over circulation. Monetary Supply Dynamics The project develops a detailed critique of fixed-supply monetary systems, arguing that effective currencies require elastic supply mechanisms that can adapt to local economic conditions (https://hardmoneyproject.substack.com/p/are-we-talking-money-or-currency) . This argument extends to both traditional gold standards and modern cryptocurrency systems. Historical analysis is used to demonstrate how previous attempts at rigid monetary systems, including the classical gold standard, ultimately faced challenges due to their inability to adapt to changing economic conditions (https://hardmoneyproject.substack.com/p/money-ii-from-central-banking-to) . The project suggests that this historical pattern remains relevant for understanding contemporary monetary developments. Future Monetary Paradigm The project envisions a future monetary system that combines Bitcoin's role as a neutral base money with more flexible, community-oriented currencies (https://hardmoneyproject.substack.com/p/inception-satoshi-nakamoto-monetary) . This system would aim to balance the benefits of hard money with the practical needs of economic communities. Drawing on Satoshi Nakamoto's early writings, the project argues for the development of "programmable P2P social currencies" built on Bitcoin's foundation (https://hardmoneyproject.substack.com/p/inception-satoshi-nakamoto-monetary) . These would be designed to provide the supply elasticity needed for effective currency function while maintaining the benefits of decentralization. 
Critique of Current Fiat System The project presents a comprehensive critique of the current global fiat system, describing it as the first worldwide fiat experiment in history (https://hardmoneyproject.substack.com/p/money-ii-from-central-banking-to) . The analysis suggests that this system, lacking any neutral anchor or commodity backing, represents a significant departure from historical monetary arrangements. A key argument is that the current system's centralization of monetary power has led to increased financial instability and wealth inequality (https://hardmoneyproject.substack.com/p/money-ii-from-central-banking-to) . The project suggests that this centralization allows for political manipulation of monetary policy at the expense of economic stability and the common good. ## Surge URL: https://radix.wiki/ecosystem/surge Updated: 2026-02-06 Summary: Surge is a decentralized perpetuals exchange launched on the Radix network as a collaborative effort between companies CaviarNine, KeyRock, and Supra. It is the Surge is a decentralized perpetuals exchange launched on the Radix network as a collaborative effort between companies CaviarNine, KeyRock, and Supra. It is the first perpetual DEX built on the Radix blockchain, designed to facilitate off-ledger trading in the decentralized finance (DeFi) ecosystem. Overview Perpetual exchanges like Surge enable traders to take leveraged positions without being constrained by traditional expiry dates. This allows traders to hold positions indefinitely without the need to roll over contracts periodically. Additionally, perpetual contracts provide the ability to trade with leverage, enabling traders to potentially amplify their profits or losses. Surge aims to provide a comprehensive and secure trading platform tailored for these types of perpetual markets. 
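The overview above notes that leverage lets perpetual traders amplify both profits and losses. The basic mechanics can be illustrated with a minimal sketch; this is generic perpetuals arithmetic, not Surge's actual margin engine, and all names and numbers below are hypothetical.

```python
# Minimal illustration of leveraged perpetual PnL. This is generic
# perps arithmetic, not Surge's actual margin engine; all parameter
# names and example numbers are hypothetical.

def perp_pnl(margin: float, leverage: float,
             entry_price: float, exit_price: float,
             long: bool = True) -> float:
    """PnL of a leveraged position: notional size is margin * leverage,
    and gains/losses scale with the relative price move."""
    position_size = margin * leverage
    move = (exit_price - entry_price) / entry_price
    return position_size * (move if long else -move)

# A 5% price rise on a 10x long roughly halves-again 100 of margin
# (+50 PnL); the same move against a 10x short loses roughly 50.
print(perp_pnl(100, 10, 1.00, 1.05, long=True))
print(perp_pnl(100, 10, 1.00, 1.05, long=False))
```

The example also shows why leverage cuts both ways: a 10% move against a 10x position wipes out the entire margin, which is why real perpetual exchanges enforce maintenance margins and liquidations. Surge's own risk parameters (margin requirements, funding rates, liquidation thresholds) are not specified in this text.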
It incorporates features from established protocols while introducing its own innovative elements. Surge was developed by leveraging the expertise of its founding companies in areas such as oracle technology and financial services. CaviarNine, KeyRock, and Supra contributed their respective specialties to create the perpetual DEX. As the first such platform on Radix, Surge represents a significant addition to the network's growing DeFi ecosystem. Key Features Surge incorporates several key features (https://surgetrade.medium.com/introducing-surge-496de2302f11) to provide a robust and user-friendly perpetual trading experience within the Radix ecosystem: - Advanced Trading Platform: Surge supports over 100 trading instruments, catering to a wide array of trading strategies. Accurate price feeds for these instruments are provided by Supra, one of the founding companies known for its expertise in oracle technology. - Innovative Account and Collateral Management: Surge implements a novel account system that combines efficiency and flexibility. Traders have the option to choose between pooled and segregated margin accounts, allowing them to manage their collateral according to their preferences. - User-Friendly Interface: While offering advanced trading capabilities, Surge's interface is designed to be accessible to both novice and experienced traders. Complex operations are simplified without compromising on sophistication, ensuring a smooth learning curve for new users. - Use of Scrypto Smart Contracts: Surge leverages the power of Scrypto, the native smart contract language of the Radix network. This integration ensures a seamless and secure trading experience, taking advantage of the inherent security and scalability features of the Radix blockchain. - Additional Features: Surge offers several additional features to enhance the trading experience, including instant trading without the need for sign-up, advanced referral rewards systems, and more. 
Further details on these features will be unveiled in the coming months. By combining these key features, Surge aims to provide a comprehensive and user-friendly perpetual trading platform tailored to the needs of the Radix ecosystem. $SRG Token Central to the Surge ecosystem is the SRG token, which serves several critical functions within the platform. This native utility token facilitates governance, fee structures, and rewards mechanisms. - Governance: SRG token holders will have the ability to participate in the governance of the Surge protocol. This includes voting on proposed changes, upgrades, and the overall direction of the platform, ensuring a decentralized and community-driven approach. - Fee Structures: The SRG token will play a role in determining the fee structures for trading activities on the Surge platform. Specific details on how fees will be calculated and distributed are yet to be announced. - Rewards: SRG tokens will be used to incentivize and reward various activities within the Surge ecosystem. This may include liquidity provision, referral programs, and other initiatives aimed at fostering a vibrant and engaged community. - Token Allocation and Utility: Comprehensive details regarding the token allocation and full utility of the SRG token will be unveiled in the near future. These details will highlight the advantages and benefits for early supporters and community members. The introduction of the SRG token is a crucial component of the Surge ecosystem, enabling decentralized governance, sustainable fee models, and incentive mechanisms to drive growth and adoption within the perpetual trading platform. Community Surge is committed to fostering a vibrant and engaged community from the early stages of its development. To achieve this, the platform is offering various incentives and rewards for early adopters and community members. 
Early Access and Registration: Surge is inviting interested traders and users to sign up for early access to the platform through their official website. Early registrants will gain exclusive access to preview and experience the future of DeFi trading on Surge before the wider public launch. Community Engagement and Rewards: Recognizing the importance of community participation, Surge has implemented several initiatives to reward and incentivize engagement: - Early Adopter Rewards: Those who join the platform during the initial stages will be eligible for special rewards and incentives, details of which will be shared in the future. - Referral Rewards: Surge will feature an advanced referral rewards system, encouraging existing users to invite others to join the platform. Referrers and referred users will both benefit from this program. - Testnet Participation: Community members who actively participate in testing and providing feedback during the testnet phase will be rewarded for their contributions. Surge is dedicated to nurturing a robust and active community by providing these incentives and opportunities for early involvement. Regular announcements and updates will be shared through the official Surge Twitter account and Telegram group, keeping the community informed about the latest developments, rewards, and initiatives. By engaging with the community from the outset, Surge aims to establish a strong foundation for the long-term success and growth of its perpetual trading platform within the Radix ecosystem. Future Developments Surge has an ambitious roadmap outlining the continuous expansion and enhancement of its perpetual trading platform. While the initial launch marks a significant milestone, the team behind Surge is committed to continuously improving the platform's features and capabilities. 
Some key areas of focus for future developments include: - Expanding Trading Instruments: While Surge initially supports over 100 trading instruments, the team plans to further diversify the range of available instruments, catering to a broader spectrum of trading strategies and market demands. - Integrating Advanced Trading Tools: Surge aims to introduce advanced trading tools and analytical features to empower traders with comprehensive market insights and decision-making capabilities. - Enhancing User Experience: Continuous improvements to the user interface and overall user experience are planned, ensuring that Surge remains intuitive and accessible for both novice and experienced traders. - Scaling and Performance Optimizations: As adoption grows, Surge will focus on optimizing its infrastructure and leveraging the scalability benefits of the Radix network to handle increased trading volumes and user activity seamlessly. - Strengthening Security and Compliance: Surge is committed to maintaining the highest standards of security and compliance, with plans to implement additional security measures and regulatory frameworks as the platform evolves. - Community-Driven Developments: Surge values community input and plans to incorporate feedback and suggestions from its user base to shape the platform's future roadmap. While specific timelines and details for these future developments will be shared in the coming months, Surge's roadmap demonstrates the team's dedication to delivering a cutting-edge and constantly improving perpetual trading experience within the Radix ecosystem. ## Supreme Stake URL: https://radix.wiki/ecosystem/supreme-stake Updated: 2026-02-06 Summary: Supreme Stake has been validating since Radix Betanet and will be well into the future. Supreme Stake has been validating since Radix Betanet and will be well into the future. 
Overview Supreme Stake is a partnership of technology professionals with extensive experience in infrastructure, networking, security, and automation. They bring their expertise in running enterprise-level technology to their Radix validator node. Supreme Stake is made up of dedicated technologists and seasoned crypto veterans. The team includes a Radix OG and others who stumbled upon the technology after years in crypto, realized its potential, and dived in head first. Mission Supreme Stake is a partnership of technology professionals with decades of experience in infrastructure, networking, security, and automation. Supreme Stake is dedicated to providing an outstanding product to the community, and they have the knowledge to deliver that. Product Supreme Stake is hosted in a SOC 2 compliant public cloud and has backup nodes in redundant colocation facilities. Benefits Low Fees A 1.5% validator fee means 98.5% of emissions go to stakers. High Confidence Supreme Stake doesn't cash out its earnings to pay for the node, because the team knows keeping it long term will be worth it. Personal Stake Supreme Stake has over 750k of its own Radix tokens staked on the node. High Uptime Supreme Stake takes delegators' trust seriously. They have redundancy and automation to ensure issues are resolved quickly, and downtime is reduced to an absolute minimum. Top tier infrastructure This isn't your weird cousin's basement node. Supreme Stake runs its validator in a world-class public cloud, with enterprise monitoring and DDoS protection in place, because that's the attention it deserves. ## Stream Wallet URL: https://radix.wiki/ecosystem/stream-wallet Updated: 2026-02-06 Summary: Radix Stream is a self-custodial mobile wallet designed for use with Radix. It is touted as the most used self-custodial mobile wallet for Radix DLT. Radix Stream is a self-custodial mobile wallet designed for use with Radix. It is touted as the most used self-custodial mobile wallet for Radix DLT. 
Overview Radix Stream provides a secure and easy-to-use platform for storing, sending, and receiving tokens on the Radix decentralised ledger. The wallet can be used as a standalone wallet, as a wallet tracker, or both simultaneously. As a self-custodial wallet, users of Radix Stream are the sole controllers of their Seed Phrase and login credentials. This means that the security and control of assets within the wallet are entirely in the hands of the user. Contact and Support Users can reach out to the Radix Stream team via Twitter or through the Telegram and Discord channels. These channels are also used for bug reporting and feature requests. Disclaimer Radix Stream, the Ideomaker SDK, and the developers are not to be held accountable for any loss of tokens or any other damage resulting from the usage of the app. Users are reminded that they are the only ones in control of their Seed Phrase and login credentials, and they bear full responsibility for the security of their assets. ## StakingCoins URL: https://radix.wiki/ecosystem/stakingcoins Updated: 2026-02-06 Summary: StakingCoins is a blockchain node and validator company founded by Marco Michelino, an experienced Italian Linux system administrator and member of the Radix DL StakingCoins (https://radixdlt.stakingcoins.eu) is a blockchain node and validator company founded by Marco Michelino, an experienced Italian Linux system administrator and member of the Radix DLT community. History Marco, the founder of StakingCoins, first began his journey into the world of cryptocurrencies in 2013 when he purchased his first Bitcoin. Over the years, he expanded his activities into running various blockchain nodes. His engagement with the Radix DLT community further deepened when he participated in testing the Radix DLT betanet with his own node. Marco also serves as a Radix Ambassador and runs the unofficial Radix DLT Italian Telegram group (@Radix_Italia). Infrastructure StakingCoins operates two main nodes. 
The primary node is a physical server (Dell PowerEdge R440) located in a tier 4 data center in Italy. This data center is equipped with anti-DDoS protection and has excellent international peering routes. The backup node, a Dell PowerEdge R430, is located in another tier 4 data center in the Czech Republic. Validator Proposal StakingCoins applies a fee of 2% (approximately 0.18% effective fee) to its delegators. Communication For delegators of StakingCoins, there is a dedicated Telegram group (@Radix_StakingCoins), where a bot automatically informs members of any issues with the validator node. In addition to this, Marco can be contacted through Discord at rigel#5123 and is active on Twitter under the handle @Radix_SC. ## StakeSafe URL: https://radix.wiki/ecosystem/stakesafe Updated: 2026-02-06 Summary: StakeSafe is a product that is designed to provide high-performance nodes, secure staking, and professional support for the decentralized Radix ecosystem. The p StakeSafe is a product that is designed to provide high-performance nodes, secure staking, and professional support for the decentralized Radix ecosystem. The product's focus is on ensuring the efficient and secure operation of the Radix network through the provision of reliable nodes that enable secure staking and provide a dedicated support team that can help users with any issues that they might encounter. By using StakeSafe, users on the Radix network can benefit from reliable and high-performance nodes that provide them with fast transaction processing speeds and low latency. Additionally, the product also enables users to securely stake their tokens while maintaining control over their assets, providing them with greater security and peace of mind. Overall, StakeSafe is a product that aims to empower users and contribute to the growth and success of the Radix ecosystem by providing cutting-edge infrastructure and support that enables users to transact on the network with confidence and ease. 
Mission The mission of StakeSafe is to empower and contribute to the decentralized Radix ecosystem by providing high-quality infrastructure, secure staking services, and professional support. Benefits Delegators can benefit from staking with StakeSafe for several reasons: Enhanced Security StakeSafe prioritizes secure staking, ensuring that delegators' assets are protected. By utilizing secure staking mechanisms, delegators can have peace of mind knowing that their tokens are being staked in a safe and reliable manner. Reliable Infrastructure StakeSafe provides high-performance nodes, which are essential for efficient and effective staking. Delegators can rely on StakeSafe's robust infrastructure to ensure smooth and uninterrupted staking operations. Dedicated Professional Support StakeSafe offers dedicated support from a team of professionals. This support can be invaluable for delegators who may have questions, need assistance, or encounter any issues during the staking process. The responsive and knowledgeable support team can provide timely help and ensure a positive staking experience. Active Participation in the Radix Ecosystem By staking with StakeSafe, delegators actively contribute to the decentralized Radix ecosystem. Their participation and stake help secure the network while potentially earning rewards in the form of incentives or staking rewards. Source: StakeSafe: Empowering a decentralized Radix ecosystem (https://www.stakesafe.net/) Impact StakeSafe can have several impacts on users and the wider ecosystem. Here are a few potential impacts of using StakeSafe: Staking Incentives StakeSafe allows users to stake their tokens and participate in the network's consensus protocol. By doing so, users can potentially earn rewards or incentives for contributing to the network's security and operation. This can be a significant incentive for users to actively participate in the ecosystem. 
Network Security Staking services like StakeSafe play a crucial role in maintaining the security and integrity of the network. By staking their tokens, users contribute to the network's decentralization and consensus process, making it more resilient to attacks or malicious activities. Ecosystem Growth Delegators using StakeSafe contribute to the health and growth of the ecosystem. By actively participating in the network, users help ensure its stability and attract more participants, developers, and businesses to the ecosystem. This increased activity and engagement could lead to a flourishing Radix network. User Convenience StakeSafe aims to provide a user-friendly staking experience, simplifying the process for users who may not have the technical expertise or infrastructure to stake directly. This convenience can encourage more users to participate and stake their tokens, boosting overall network participation and security. ## StakeBros URL: https://radix.wiki/ecosystem/stakebros Updated: 2026-02-06 Summary: StakeBros is a validator group on the Radix network. Validator groups like StakeBros play a crucial role in validating transactions and maintaining the security of the network. StakeBros is a validator group on the Radix network. Validator groups like StakeBros play a crucial role in validating transactions and maintaining the security and integrity of the blockchain network. Benefits There are several potential benefits (https://stakebros.info/) to delegating stake to the StakeBros community validator node on the Radix network. Here are some potential benefits of joining StakeBros as a staker: Earn rewards By delegating stake to StakeBros, stakers can earn a share of the 300,000,000 XRD network emissions as rewards for validating transactions and securing the Radix network.
Powerful servers According to StakeBros, their validator nodes are powered by 2nd-gen AMD EPYC CPUs, 64GB RAM, and enterprise NVMe SSDs, which provide robust processing power and data storage capabilities to ensure fast and efficient transaction processing. Decentralization StakeBros has chosen a hosting provider, Stratode, which is located in Germany and is not currently used by any other top 100 validator on the Radix network. This decision aims to further strengthen network decentralization and ensure a more robust and distributed validation mechanism. Safe and secure StakeBros follows DevSecOps principles to ensure its operations and monitoring are robust and secure. They also run a full backup node to ensure quick failover and high uptime in case of any issues. Industry experience StakeBros consists of two brothers who have over 30 years of combined experience working in IT and DevOps. As crypto enthusiasts, they are well-equipped to handle validator node operations and provide high-quality service. Support network growth By choosing to delegate stake to StakeBros, stakers are supporting the growth and decentralization of the Radix network, which is seen as the future of decentralized finance (DeFi) by the StakeBros team. Staking Process Here are the steps provided (https://stakebros.info/) by StakeBros: - Open your Radix Desktop wallet. - Go to the Stake/Unstake tab. - Enter the StakeBros validator address (rv1q087r7c4fhzhvjgq665jc8re9lwjvgamx6qrdl436r92wfuw3hz6x4lp06u) and choose the amount you want to stake. - Click the stake button to start the transaction. ## Stabilis URL: https://radix.wiki/ecosystem/stabilis Updated: 2026-02-06 Summary: Stabilis is a decentralized financial ecosystem built on the Radix DLT platform, comprising two main components: the ILIS DAO and the STAB Protocol. 
Stabilis (https://docs.ilikeitstable.com/introduction/protocol-overview) is a decentralized financial ecosystem built on the Radix DLT (https://www.radixdlt.com/) platform, comprising two main components: the ILIS DAO and the STAB Protocol. The ecosystem aims to create stability within the Radix DeFi landscape through algorithmic mechanisms and community governance. Overview The Stabilis ecosystem was developed to address the need for stable assets within the Radix DLT network. At its core, the system consists of two main components (https://docs.ilikeitstable.com/introduction/protocol-overview) : the ILIS DAO (I Like It Stable DAO) and the STAB Protocol. The ILIS DAO serves as the governing entity, while the STAB Protocol provides the technical infrastructure for maintaining a stable asset called $STAB. ILIS DAO The ILIS DAO is incorporated as a non-profit Decentralized Autonomous Organization (https://docs.ilikeitstable.com/introduction/protocol-overview/ilis-dao) in the Marshall Islands under the legal name "I Like It Stable DAO LLC." The organization operates under the Marshall Islands DAO Act and is structured to ensure transparent, community-driven governance of the Stabilis ecosystem. The DAO's primary mission is to build, manage, and support open-source software for digital stable assets on the Radix DLT platform. STAB Protocol The STAB Protocol represents the first protocol governed by the ILIS DAO (https://docs.ilikeitstable.com/introduction/protocol-overview/stab-protocol) , designed to provide a stable asset ($STAB) for the Radix ecosystem. Unlike traditional stablecoins that maintain a rigid peg to fiat currencies, STAB employs a unique interest rate mechanism that allows for dynamic price adjustment based on market conditions. This approach aims to achieve what the protocol terms "fair-priced stability."
Development History The Stabilis ecosystem was founded by a developer known as Octopus (https://docs.ilikeitstable.com/ilis-dao/founders) , who has a background in Physics and an interest in complex systems. The project began as a personal endeavor to explore the possibilities of the Radix platform and its Scrypto programming language, eventually evolving into a full-fledged ecosystem. The founder emphasizes a meritocratic approach to development, focusing on practical results over individual recognition. The ecosystem officially launched with the incorporation of the ILIS DAO on August 22, 2024 (https://docs.ilikeitstable.com/introduction/protocol-overview) , marking a significant milestone in the development of decentralized financial infrastructure on the Radix network. Technical Infrastructure The Stabilis ecosystem leverages several innovative technical features to achieve its goals. The STAB Protocol utilizes a collateralized debt position (CDP) model (https://docs.ilikeitstable.com/introduction/protocol-overview/stab-protocol) where users can borrow $STAB tokens by providing collateral. The system maintains stability through a combination of: - A dynamic interest rate mechanism that adjusts based on market supply and demand. - A minimum collateralization ratio (MCR) of 150%. - Liquidation mechanisms to ensure system solvency. - Flash loan capabilities for advanced DeFi operations. The ecosystem's smart contracts are implemented using Radix's Scrypto programming language, with key components including the Stabilis Proxy Component (https://docs.ilikeitstable.com/stab-protocol/technical-info/components-and-manifests) for advanced operations and various resource-oriented tokens for governance and liquidity provision. ILIS DAO Organization The ILIS DAO operates as a non-profit limited liability company incorporated in the Marshall Islands. 
The organization's primary purpose, as stated in its Operating Agreement, is to "build, manage, and support open-source software in the field of digital stable assets for the Radix DLT, with the goal of fostering transparent, fair, and accountable systems that promote community-driven initiatives and societal benefits, without the aim of profit generation." (https://docs.ilikeitstable.com/introduction/protocol-overview/ilis-dao) The DAO's legal structure is governed by multiple Marshall Islands laws, including the Limited Liability Company Act of 1996, the Non-Profit Entities (Amendment) Act of 2021, the Business Corporations Act, and the Decentralized Autonomous Organization Act of 2022. This legal framework provides the organization with a fit-for-purpose regime designed to protect members while supporting innovation in the blockchain space. Under its non-profit status (https://docs.ilikeitstable.com/introduction/protocol-overview) , the DAO is restricted from: - Allowing net earnings to benefit any individual. - Exclusively engaging in legislative influence. - Participating in political campaigns. - Operating primarily as a for-profit business. Governance Structure The ILIS DAO employs a token-based governance system where decisions are made and executed on-chain whenever possible (https://docs.ilikeitstable.com/introduction/protocol-overview/ilis-dao) . The governance process involves members voting on proposals through a dedicated Voting Smart Contract (https://docs.ilikeitstable.com/ilis-dao/using-ilis-dao/governance) , with voting power determined by the number of ILIS tokens held in a member's staking position. Membership System Membership in the ILIS DAO is obtained through the acquisition and staking of ILIS tokens (https://docs.ilikeitstable.com/ilis-dao/using-ilis-dao/membership) . The membership system includes several key features: - Members must create a Membership ID. - ILIS tokens can be staked to the Membership ID. 
- Staked tokens provide voting rights and staking rewards. - Members can optionally lock their tokens for extended periods to receive additional rewards. - A 7-day unstaking period applies when members wish to withdraw their tokens. Proposal System The DAO's governance operates through a proposal system where members can submit and vote on various initiatives (https://docs.ilikeitstable.com/ilis-dao/using-ilis-dao/governance) . Each proposal includes: - A title and description. - Detailed specifications. - Attachments with additional information. - A voting deadline. - A quorum requirement. Once a member votes on a proposal, their staked tokens are locked until the voting period ends, preventing double voting and ensuring governance integrity. Token Distribution and Economics The ILIS DAO utilizes a Liquidity Bootstrapping Pool (LBP) mechanism (https://docs.ilikeitstable.com/ilis-dao/technical-info/liquidity-bootstrapping-pools) for its initial token distribution. This approach aims to achieve both fair distribution and sufficient liquidity for the $ILIS token. The LBP works by: - Starting with an intentionally high token price that gradually decreases. - Discouraging early token accumulation by major buyers. - Creating a "fair pricing game" where participants attempt to time their purchases optimally. - Automatically adjusting token weights over time to influence price dynamics. Incentive Structure The DAO maintains an incentive system to reward actions that benefit the Stabilis ecosystem (https://docs.ilikeitstable.com/ilis-dao/using-ilis-dao/incentives) . Key features of the incentive system include: - Weekly ILIS token rewards for specific beneficial actions. - Staking requirements for receiving rewards. - Optional token locking for enhanced rewards. - Manual claiming of rewards within a 5-week window. - Incentivized resources, such as $LPSTAB tokens for liquidity provision. 
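The weight-shifting price dynamics of the Liquidity Bootstrapping Pool described above can be sketched with standard weighted-pool math. This is a generic, Balancer-style illustration rather than the DAO's actual implementation; the pool balances, the 90%-to-50% weight schedule, and the 72-hour sale duration are all hypothetical.

```python
# Sketch of a Balancer-style LBP price curve. All parameters below are
# hypothetical illustrations, not values published by the ILIS DAO.

def lbp_spot_price(balance_in, weight_in, balance_out, weight_out):
    """Spot price of the distributed token, in units of the paired asset."""
    return (balance_in / weight_in) / (balance_out / weight_out)

def interpolated_weight(start, end, t, duration):
    """Linearly shift a token weight from `start` to `end` over the sale."""
    return start + (end - start) * min(t / duration, 1.0)

# Example: the distributed token's weight falls from 90% to 50% over a
# 72-hour sale, so the quoted price declines even with no sell pressure.
for hour in (0, 36, 72):
    w_token = interpolated_weight(0.90, 0.50, hour, 72)
    w_pair = 1.0 - w_token
    price = lbp_spot_price(balance_in=100_000, weight_in=w_pair,
                           balance_out=1_000_000, weight_out=w_token)
    print(hour, round(price, 4))
```

With these numbers the quoted price falls from 0.9 to 0.1 purely from the weight shift, which is the "intentionally high starting price that gradually decreases" behavior the text describes.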
Technical Implementation The DAO's technical infrastructure is built on the Radix DLT platform and includes several key smart contracts: - A Voting Token Smart Contract for membership management (https://docs.ilikeitstable.com/ilis-dao/using-ilis-dao/membership) . - A Voting Smart Contract for proposal processing (https://docs.ilikeitstable.com/ilis-dao/using-ilis-dao/governance) . - Incentive management components for reward distribution (https://docs.ilikeitstable.com/ilis-dao/using-ilis-dao/incentives) . The DAO maintains its official online presence at ilikeitstable.com (https://docs.ilikeitstable.com/introduction/protocol-overview) , where members can access governance features, documentation, and participate in DAO activities. STAB Protocol Overview The STAB Protocol is the first protocol developed under the governance of the ILIS DAO, designed to provide a stable asset ($STAB) for the Radix DLT ecosystem. Unlike traditional stablecoins, STAB employs a dynamic interest rate mechanism that allows its value to fluctuate within controlled parameters (https://docs.ilikeitstable.com/introduction/protocol-overview/stab-protocol) , aiming to achieve what the protocol terms "fair-priced stability." Technical Architecture Loan System The STAB Protocol utilizes a collateralized borrowing system similar to other decentralized stablecoin protocols (https://docs.ilikeitstable.com/stab-protocol/using-stab-protocol/borrow) . Users can create $STAB by providing collateral and opening a loan position. The system maintains several key parameters: - A Minimum Collateralization Ratio (MCR) of 150% for all collateral types. - A total liquidation penalty of 15%, with 10% going to liquidators and 5% to the protocol (https://docs.ilikeitstable.com/stab-protocol/technical-info/system-parameters) . - Currently, only XRD (the native token of Radix) is accepted as collateral. 
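The parameters above (150% MCR; 15% liquidation penalty split 10%/5%) can be sketched as a simple loan-health check. The amounts and prices are hypothetical, and the base on which the penalty is charged is an assumption, since the documentation summarized here does not pin it down.

```python
# Loan-health sketch using the STAB Protocol parameters cited above.
# Amounts, prices, and the penalty base are hypothetical illustrations.

MCR = 1.50               # minimum collateralization ratio (150%)
LIQUIDATOR_SHARE = 0.10  # liquidator's share of the 15% penalty
PROTOCOL_SHARE = 0.05    # protocol's share of the 15% penalty

def collateralization_ratio(collateral_value, debt_value):
    """Both values expressed in a common unit (e.g. USD)."""
    return collateral_value / debt_value

def is_liquidatable(collateral_value, debt_value):
    return collateralization_ratio(collateral_value, debt_value) < MCR

def penalty_split(debt_value):
    """Assumes the 15% penalty is charged on the repaid debt value."""
    return debt_value * LIQUIDATOR_SHARE, debt_value * PROTOCOL_SHARE

# 1,500 XRD of collateral at $0.04 backing 45 $STAB at ~$1:
# ratio = 60 / 45 ≈ 1.33, below the 150% MCR, so the loan can be marked.
print(is_liquidatable(1_500 * 0.04, 45 * 1.0))  # True
```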
Loan management features (https://docs.ilikeitstable.com/stab-protocol/using-stab-protocol/manage-loans) allow users to: - Add or remove collateral from existing loans. - Borrow additional $STAB against existing collateral. - Partially or fully repay outstanding loans. - Monitor loan health and collateralization ratios. Stability Mechanisms The STAB Protocol employs several mechanisms to maintain price stability: - Interest Rate Controller The protocol uses a dynamic interest rate system (https://docs.ilikeitstable.com/stab-protocol/technical-info/interest-rate) that influences $STAB's internal price. The interest rate: - Can range from -33.33% to +50% annually. - Adjusts based on market supply and demand. - Influences borrowing and holding incentives. - Uses a PID controller for precise adjustments. The interest rate calculation (https://docs.ilikeitstable.com/stab-protocol/technical-info/interest-rate) follows the formula: interest_rate -= (kp * (price_error / internal_price) + ki * (average_of_last_50_prices / internal_price)) * passed_minutes where kp and ki are controller constants, and price_error is the difference between market price and internal price. - Liquidation System The protocol includes a two-step liquidation process (https://docs.ilikeitstable.com/stab-protocol/using-stab-protocol/liquidations) to maintain system solvency: - Marking Phase: Liquidators first mark unhealthy loans for liquidation. - Execution Phase: After a 5-minute delay, the marker can liquidate the loan. - Public Liquidation: If the marker doesn't act within 10 minutes, any user can liquidate the marked loan. - Additional Stability Features The protocol includes redemptions and forced minting mechanisms (https://docs.ilikeitstable.com/stab-protocol/technical-info/components-and-manifests) that provide additional stability tools: - Redemptions allow for forced liquidation of the lowest-collateralized loans.
- Forced minting enables additional $STAB creation from highly-collateralized loans. - Flash loans are available for advanced trading strategies. Liquidity System The protocol maintains a dedicated XRD/STAB liquidity pool (https://docs.ilikeitstable.com/stab-protocol/using-stab-protocol/swap) where users can: - Swap between $XRD and $STAB tokens. - Provide liquidity and receive $LPSTAB tokens. - Monitor real-time trading metrics and pool statistics. The system calculates a Real 7-day APY (https://docs.ilikeitstable.com/stab-protocol/technical-info/real-7d-apy) for liquidity providers using the formula: 7_day_real_apy = ((lp_valuation_now / lp_valuation_7_days_ago)^(365 / 7) - 1) * 100 This calculation uses current asset prices for both time periods to provide a fair comparison between holding assets and providing liquidity. Risk Factors The STAB Protocol documentation identifies several key risks (https://docs.ilikeitstable.com/stab-protocol/dangers) : - Smart Contract Risk: Potential vulnerabilities in the protocol's code could lead to stuck or drained resources. - Liquidation Risk: Loans may face liquidation if their collateralization ratio falls below the MCR, resulting in penalty fees. - Oracle Risk: The potential for oracle manipulation could lead to inaccurate price feeds and improper system actions. - Depeg Risk: Market conditions might cause $STAB to trade significantly away from its target price, potentially making loan repayment or liquidation difficult. These risks are actively monitored and managed through the protocol's governance mechanisms and technical features. Technology Core Components - Voting Infrastructure - The Voting Token Smart Contract (https://docs.ilikeitstable.com/introduction/protocol-overview) , located at resource_rdx1t4r86qqjtzl8620ahvsxuxaf366s6rf6cpy24psdkmrlkdqvzn47c2, manages the ILIS governance token system. 
- A dedicated Voting Smart Contract (https://docs.ilikeitstable.com/introduction/protocol-overview) at component_rdx1cpzm6rgdwgw9p075zsh5kfjuxa9rzzyt47x6xcgzhzydp9lkymyx78 handles proposal processing and vote counting. - Stability Mechanisms - The Stabilis Proxy Component (https://docs.ilikeitstable.com/stab-protocol/technical-info/components-and-manifests) manages advanced operations including: - Flash loan functionality. - Redemption operations. - Forced minting capabilities. - A PID controller implementation (https://docs.ilikeitstable.com/stab-protocol/technical-info/interest-rate) manages the dynamic interest rate system. Resource Tokens The ecosystem utilizes several resource-oriented tokens: - $ILIS Token - Functions as the governance token (https://docs.ilikeitstable.com/introduction/protocol-overview/ilis-dao) . - Represents membership in the DAO. - Used for voting on proposals. - Can be staked for rewards. - $STAB Token - The protocol's stable asset (https://docs.ilikeitstable.com/introduction/protocol-overview/stab-protocol) . - Created through collateralized borrowing. - Subject to dynamic interest rates. - Used in liquidity pools. - $LPSTAB Token - Represents liquidity provided to the $XRD/$STAB pool (https://docs.ilikeitstable.com/stab-protocol/using-stab-protocol/swap) . - Tracks liquidity provider positions. - Used for claiming trading fees and rewards. Technical Features Automated Market Maker (AMM) The protocol includes a built-in AMM (https://docs.ilikeitstable.com/introduction/protocol-overview/stab-protocol) that: - Facilitates $XRD/$STAB trading. - Provides price information for the interest rate controller. - Supports liquidity provision and removal. - Calculates real-time trading metrics. Liquidation Engine The protocol's liquidation system (https://docs.ilikeitstable.com/stab-protocol/using-stab-protocol/liquidations) implements a unique two-phase liquidation process: - Marking Phase - Allows liquidators to identify unhealthy positions. 
- Implements a 5-minute delay before execution. - Issues Marker Receipts to track liquidation rights. - Execution Phase - Processes liquidations according to protocol parameters. - Distributes liquidation penalties between liquidators and the protocol. - Includes fallback mechanisms for uncompleted liquidations. Flash Loan Implementation The system supports flash loans through the Stabilis Proxy Component (https://docs.ilikeitstable.com/stab-protocol/technical-info/components-and-manifests) , with operations including: - Atomic borrowing and repayment. - Integration with other DeFi protocols. - Transaction validation and reversion mechanisms. Development Tools and Resources The Stabilis ecosystem maintains several technical resources: - Documentation - Comprehensive protocol documentation (https://docs.ilikeitstable.com/introduction/protocol-overview) . - Technical specifications and parameters. - Integration guides and examples. - Smart Contract Interfaces - Component manifests (https://docs.ilikeitstable.com/stab-protocol/technical-info/components-and-manifests) . - Flash loan templates. - Interaction examples. - Development Environment - Built on Radix's Scrypto language. - Utilizes Radix Engine v2. - Leverages Radix's component-based architecture.
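The interest-rate update quoted in the Stability Mechanisms section can be sketched as a stateful controller. The constants kp and ki below are hypothetical placeholders (the real values are protocol parameters set by governance), and the last-50-prices average follows the formula as documented.

```python
from collections import deque

KP, KI = 0.5, 0.05   # hypothetical controller constants

class InterestRateController:
    """Sketch of the documented update rule, clamped to the stated
    annual bounds of -33.33% to +50%."""

    def __init__(self, rate=0.0):
        self.rate = rate
        self.prices = deque(maxlen=50)  # last 50 observed market prices

    def update(self, market_price, internal_price, passed_minutes):
        self.prices.append(market_price)
        price_error = market_price - internal_price
        avg_price = sum(self.prices) / len(self.prices)
        self.rate -= (KP * (price_error / internal_price)
                      + KI * (avg_price / internal_price)) * passed_minutes
        self.rate = max(-0.3333, min(0.50, self.rate))
        return self.rate

ctrl = InterestRateController()
# Market price 2% above internal price pushes the rate down,
# discouraging holding and encouraging minting.
print(ctrl.update(market_price=1.02, internal_price=1.00, passed_minutes=1))
```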
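The Real 7-day APY formula from the Liquidity System section can be checked with a few lines of arithmetic; the LP valuations below are hypothetical.

```python
# Sketch of the "Real 7-day APY" calculation. Per the docs, both LP
# valuations are computed at current asset prices for a fair comparison.

def real_7d_apy(lp_valuation_now, lp_valuation_7_days_ago):
    """Annualize one week of LP performance, returned as a percentage."""
    return ((lp_valuation_now / lp_valuation_7_days_ago) ** (365 / 7) - 1) * 100

# A position up 0.5% on the week compounds to roughly 29.7% APY:
print(round(real_7d_apy(1005.0, 1000.0), 1))
```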
See Also - Radix DLT (https://www.radixdlt.com/) - The underlying blockchain platform on which Stabilis is built - Decentralized Autonomous Organizations (https://en.wikipedia.org/wiki/Decentralized_autonomous_organization) - The organizational structure employed by ILIS DAO - Algorithmic stablecoins (https://en.wikipedia.org/wiki/Stablecoin#Algorithmic_stablecoins) - The category of digital assets to which STAB belongs - MakerDAO (https://en.wikipedia.org/wiki/MakerDAO) - A similar protocol that also uses collateralized debt positions - PID Controllers (https://en.wikipedia.org/wiki/PID_controller) - The control system used in STAB's interest rate mechanism References - "Protocol Overview - Stabilis Ecosystem" (https://docs.ilikeitstable.com/introduction/protocol-overview) . ILIS DAO. - "STAB Protocol for Dummies" (https://docs.ilikeitstable.com/introduction/protocol-overview/stab-protocol) . ILIS DAO. - "ILIS DAO Documentation" (https://docs.ilikeitstable.com/introduction/protocol-overview/ilis-dao) . ILIS DAO. - "System Parameters" (https://docs.ilikeitstable.com/stab-protocol/technical-info/system-parameters) . ILIS DAO. - "Interest Rate Documentation" (https://docs.ilikeitstable.com/stab-protocol/technical-info/interest-rate) . ILIS DAO. - "Dangers" (https://docs.ilikeitstable.com/stab-protocol/dangers) . ILIS DAO. - "Real 7d APY" (https://docs.ilikeitstable.com/stab-protocol/technical-info/real-7d-apy) . ILIS DAO. - "Components & Manifests" (https://docs.ilikeitstable.com/stab-protocol/technical-info/components-and-manifests) . ILIS DAO. - "Operating Agreement of I Like It Stable DAO LLC" (https://docs.ilikeitstable.com/introduction/protocol-overview) . ILIS DAO. ## Staatenlos Node URL: https://radix.wiki/ecosystem/staatenlos-node Updated: 2026-02-06 Summary: Staatenlos Node is a Radix validator node. The team consists of experienced professionals in software technology, cybersecurity, and finance. They aim to contribute to the decentralization of the Radix network. Staatenlos Node is a Radix validator node.
The team consists of experienced professionals in software technology, cybersecurity, and finance. They aim to contribute to the decentralization of the Radix network and promote security and stability in the ecosystem. https://youtu.be/C5Yd_esWuN4?si=TnI7gci1YpVXcedG (https://youtu.be/C5Yd_esWuN4?si=TnI7gci1YpVXcedG) The Staatenlos Node team states that they are committed to ensuring high availability and reliable operation of their Radix node. They have implemented monitoring and backup systems to ensure that their node remains operational even in the event of hardware or network failures. Staatenlos Node also provides a user-friendly dashboard that allows Radix token holders to delegate their tokens to their validator node and receive staking rewards. This process is made easy and accessible to everyone, regardless of technical expertise. Overall, Staatenlos Node aims to provide a trustworthy and reliable service to the Radix community and contribute to building a more decentralized and secure network. If you require further information or have additional queries, it is recommended to explore their website or contact them directly. Team The team behind Staatenlos Node consists of various professionals with expertise in different fields, including software development, cybersecurity, and finance. Hans Looman Hans Looman is the founder of Staatenlos Node and a well-known figure in the crypto space. He has expertise in financial privacy and offshore banking. Matt Hamilton Matt Hamilton is the CTO of Radix, and he has been closely involved with the Staatenlos Node project. He is responsible for the technical aspects of the Radix network. Sander de Bruijn Sander de Bruijn is a developer with Staatenlos Node and has expertise in software development and cryptography. 
Staking Process The staking process for the Staatenlos Node validator node is relatively straightforward and can be done in a few simple steps: Buy Radix Tokens: First, you need to purchase Radix tokens (XRD) on a cryptocurrency exchange that supports them. Create a Wallet Next, you will need to create a Radix wallet to store your tokens. You can use an existing wallet that supports Radix, such as MyRadixWallet or Radix Desktop Wallet, or create a new one. Delegate Tokens Once you have purchased and transferred your Radix tokens to your wallet, you can now delegate them to the Staatenlos Node validator node. You can delegate your tokens to the node by specifying the Staatenlos Node address as the recipient of your delegation. The address can be found on their website or by contacting them directly. Earn Rewards As a token holder, you will start earning staking rewards for delegating tokens to the node. The rewards may vary based on the number of tokens you delegate, the length of time you hold them, and other factors. The rewards are distributed periodically and can be automatically claimed and reinvested back into staking to maximize returns. ## SRWA URL: https://radix.wiki/ecosystem/srwa Updated: 2026-02-06 Summary: SRWA is a platform built on Radix to facilitate lending and borrowing with real-world assets serving as collateral, analogous to houses in mortgages or gold in traditional financing. SRWA is a platform built on Radix to facilitate lending and borrowing with real-world assets serving as collateral, analogous to houses in mortgages or gold in traditional financing. SRWA promises enhanced security and user experience, prerequisites for extensive adoption. Runs on Radix Q&A: SRWA (https://youtu.be/UfKgtYBgpRs) Products Ploughshare Main article: Ploughshare (Ploughshare%20a17c49e48ed741698407de4c19fb8204.md) Lending Protocol SRWA.io (http://SRWA.io) will permit users to deposit selected tokens (e.g., $XRD, $MOO) and earn interest.
Simultaneously, they can borrow these tokens against a specified interest. The accrued interest is split between the Ploughshare project and depositors at a predefined rate. Both lenders and borrowers will be incentivized by the $MOO token distribution. The team aims to incorporate NFT capability for representing real-world assets in future updates. Ecosystem and Partnerships The present design envisions a two-fold ecosystem with Ploughshare and SRWA. While Ploughshare is poised to evolve into a DAO (Decentralized Autonomous Organization), SRWA contributes as the technological team. Anticipated partnerships encompass Liquidity Providers, DEX, CEX, Asset Originators, Auditors, Oracles, Stablecoin Protocols, Identity & KYC, Fiat on/off-ramp, Bridges, and Multisig solutions. Real-World Assets (RWA) Integration Central to SRWA's value proposition is the tokenization of RWAs, enabling users to collateralize assets like farms or machinery. This is facilitated via a Special Purpose Vehicle (SPV) created to register and tokenize such assets. Subsequently, an NFT representing the asset is generated, and its owner can then demonstrate their collateral on the chain. Decentralized Governance The ultimate objective is to transition all decision-making to decentralized governance. While the specifics remain under discussion, an initial Treasury, earmarked for DAO operations, will be established coinciding with the token launch. Initial Capital and Radix Network Support SRWA's initial capital, denominated in XRD, is earmarked for future staking to fortify the entire Radix network. While building Validating nodes and Gateway API may not be central to Ploughshare's mission, SRWA envisions developing and operating a dedicated Full Node as a learning exercise. History The inspiration for SRWA came from the Radix Scrypto DeFi Challenge. The SRWA team engaged in an in-depth exploration of DeFi lending protocols, resulting in a Compound-like lending mechanism. 
This system permits assets to function as a lending pool for borrowers and as collateral. Their demo, which can be tested with a resim simulator, mirrors this. Nikola, one of the co-founders, embarked on his journey into the blockchain world around 2016-2017. Initially drawn to blockchain's potential beyond native assets like Bitcoin and Ethereum, Nikola envisioned applying the technology to real-world assets. Early experiments involved assets such as gold. As the project idea matured over time, the team identified inefficiencies in traditional financial systems. They were particularly struck by the disparity in time taken to raise credit traditionally, citing an experience where it took them four and a half months for a credit line, versus an hour and a half using blockchain technology. Team The SRWA project boasts a diverse and experienced team. Key members include: - Nikola Sologub (https://www.notion.so/Nikola-Sologub-4a349b9649054a53b9f8da91600e5838?pvs=21) : An early enthusiast of blockchain and DLTs, Nikola has been pivotal in steering the direction of the project. - University Professor: A co-founder responsible for the protocol development, with a team skilled in working on various networks including Cosmos and Polygon. - Additional co-founders bring expertise in decentralized finance (DeFi) and DApp (decentralized application) development. The project took shape about a year and a half before the present day. A significant milestone was their discovery of Radix in December, which they found to be a crucial solution to enhancing user experience, a challenge they faced throughout their project's development.
Collaboration and Future Plans SRWA endorses open-source methodologies and is particularly interested in Radix Royalty mechanics. Collaborations with DEXes, Oracles, Stablecoins, and other entities are anticipated. Their philosophy prioritizes integration over re-invention wherever effective components exist. ## Slightlyiffy URL: https://radix.wiki/ecosystem/slightlyiffy Updated: 2026-02-06 Summary: Slightlyiffy is a cryptocurrency consultancy and validator operation founded by Slightlyiffy. The company offers services and expertise to individuals and organizations. Slightlyiffy is a cryptocurrency consultancy and validator operation founded by Slightlyiffy (https://www.notion.so/Slightlyiffy-678aacf710e64a3da321112aa1cd3eea?pvs=21) . The company offers services and expertise to individuals and organizations seeking guidance and evaluations on various cryptocurrency projects. They aim to provide unbiased feedback and analysis to help clients make informed investment decisions. Overview The consultancy specializes in assessing the potential of different cryptocurrency projects, providing clients with a second opinion, deep dive reports, and feedback to validate their investment choices. Whether it's evaluating the feasibility of a new project, comparing alternative cryptocurrencies, or discussing industry developments, Slightlyiffy aims to support individuals and businesses in understanding the cryptocurrency landscape. Validator Status Slightlyiffy operates as a validator node and backup node for the Radix Distributed Ledger Technology (DLT) mainnet. The company actively participates in the delegated Proof-of-Stake (dPoS) consensus mechanism, securing the Radix network. Users are welcome to stake their tokens with Slightlyiffy to support network security. The company also provides basic metric monitoring for its validator node, allowing stakeholders to track its performance and rewards generated.
Services and Offerings Apart from their core consultancy services, Slightlyiffy aims to contribute to the cryptocurrency community through various activities. They leverage their spare time and resources to develop simple Radix-themed games for the community's enjoyment. This demonstrates the company's commitment to engaging with the community and fostering a thriving ecosystem. ## SingularityX URL: https://radix.wiki/ecosystem/singularityx Updated: 2026-02-06 Summary: SingularityX is also associated with a meme token called $SINX. It is described as a token fueled by respect for technology and the Radix ecosystem. The goal of SingularityX is to build a community that collaborates on the development of science. SingularityX is also associated with a meme token called $SINX. It is described as a token fueled by respect for technology and the Radix ecosystem. The goal of SingularityX is to build a community that collaborates on the development of science. Overview SingularityX is described as a group of passionate crypto and Radix enthusiasts who are active community members. They have created their own MEME token called SingularityX (SINX), which is not fueled by humor but by respect for technology and the Radix ecosystem.
Roadmap Here is a breakdown of the Roadmap (https://www.singularityx.net/) for SingularityX:
2022 Q3 (Start SingularityX): create token; create website; create Twitter and Telegram channels; airdrop; listing on DogeCubeX; listing on Ociswap.
2022 Q4 (Development): create NFT Twitter and Telegram channels; website development; marketing; create NFT collection.
2023 Q1 (Focused on NFT): marketing; create SSN NFT website; launch of SSN collection; SSN collection moved to Radish Square.
2023 Q2 (Focused on Charity): campaign with weekly airdrops; charity project VOTE; website development (Z3us and Xidar wallet connection); start of NFT marketplace creation.
2023 Q3 (Focused on Game): game development starts with SSN NFTs; marketing; create Celestial Bodies NFT website; launch of Celestial Bodies collection.
2023 Q4 (Where to go): create Swap; development of possible NODE run.
Tokenomics Here is a breakdown of the Tokenomics (https://www.singularityx.net/) of SingularityX (SINX) in percentage allocations: • Charity: 20% • Community: 20% • Marketing: 15% • Liquidity: 35% • Team: 10% The total supply of SINX tokens is 49,000,000. Mission SingularityX's stated goal is to build (https://www.singularityx.net/) a community that collaborates on the development of science. They have plans for a charitable donation supporting scientific research, particularly in the areas of web3 and space research. Additionally, SingularityX has plans to create an NFT collection. Partnerships • Ociswap (/ecosystem/ociswap) • DogeCube (/ecosystem/dogecubex) • Radit.io (/ecosystem/radit) • DSOR • Astrolescent (/ecosystem/astrolescent) ## ShardSpace URL: https://radix.wiki/ecosystem/shardspace Updated: 2026-02-06 ShardSpace is an asset and portfolio management platform designed to integrate with the Radix Wallet, extending its capabilities to desktop interfaces.
The project is aimed at simplifying asset management within the Radix ecosystem by offering a unified dashboard for executing various cryptocurrency transactions, including buying, swapping, sending, and staking. https://youtu.be/8Y4071ytnPQ?si=0pQca1FIqCG0CJ03 (https://youtu.be/8Y4071ytnPQ?si=0pQca1FIqCG0CJ03) Overview ShardSpace was conceived with the objective of leveraging the technological prowess of the Radix stack to create a comprehensive tool catering to a diverse user base comprising individuals, businesses, and developers. The platform augments the Radix mobile wallet by offering an array of features such as seamless token transfers, swapping, and liquid staking options, accessible from desktop interfaces. History ShardSpace was founded by Andrew Rankine, founder of Avaunt. Initially involved in the Radix ecosystem as a Node Runner, Andrew envisioned ShardSpace as a tool to provide a better interface for asset management on Radix. With his background in IT infrastructure, Andrew gathered a team with diverse experiences to bring the ShardSpace vision to life. ShardSpace's initial focus was the Radix Dashboard, which provided data for validators. However, recognizing the demand for a more comprehensive tool led to the evolution of ShardSpace. Development The project has been under development with a focus on creating a user-friendly dashboard to replace the original Radix dashboard website, which was primarily a static informational analytics site. Unlike its predecessor, ShardSpace is engineered to empower users with the ability to perform a variety of transactions from the convenience of their desktops. The platform's development is being spearheaded by Andrew, known as Avant within the Radix community, who transitioned from being a node validator to taking on a more front-end and service-oriented role within the ecosystem.
Features - Token Swapping: Users can easily swap between various Radix ecosystem tokens. - Token Management: ShardSpace facilitates easy management of Radix ecosystem tokens, enabling users to swap between different tokens using either Limit Orders or Market Orders. - Trading Features: In collaboration with AlphaDex and Astrolescent, ShardSpace has incorporated Limit Order and Market Order trading functionalities respectively. - Liquid Staking: Beyond asset management, ShardSpace offers Liquid Staking options, enabling users to choose between Validator Staking and Stake Pool opportunities. - Developer Tools: The platform provides tools for both simple and advanced token creation as well as an Airdrop feature to aid developers in their endeavors within the Radix ecosystem. Partnerships ShardSpace has partnered with AlphaDex for Limit Order trading and with Astrolescent for Market Order trading features. These partnerships aim to enrich the trading experience on the platform by offering greater control and convenience to users. Community The ShardSpace community can be engaged through its Telegram channel, where enthusiasts and prospective users can interact and stay updated on the latest developments. Additionally, the community is also active on Radix's Discord and Telegram channels, where Andrew (Avant) is a notable presence. Challenges and Future Development ShardSpace faces challenges in balancing a multitude of innovative ideas with time and resource constraints. To address this, they intend to adopt a feedback-driven development strategy, emphasizing gathering insights from users and the broader community. The team has set key milestones for the project: - Closed Beta Testing: Preparation for the closed beta testing phase. - MVP Launch: A release planned for early November, featuring core functionalities and tools.
- Community Growth: Emphasis on community engagement and feedback. - Feature Expansion: After the MVP launch, the focus will shift to more advanced feature implementation. - User Education: Initiatives to help users maximize platform capabilities. ## RRC-404 URL: https://radix.wiki/ecosystem/rrc-404 Updated: 2026-02-06 RRC-404 is an implementation of the ERC-404 (https://decrypt.co/resources/what-is-erc-404-the-experimental-semi-fungible-ethereum-token-standard) token standard on Radix, developed using the Scrypto programming language. It aims to provide fungible liquidity for non-fungible tokens (NFTs), blurring the lines between fungible and non-fungible assets. https://x.com/aus877/status/1761873683662946654 (https://x.com/aus877/status/1761873683662946654) Overview The ERC-404 standard, originally designed for the Ethereum blockchain, introduces a hybrid token type that combines aspects of both fungible (ERC-20) and non-fungible (ERC-721) tokens (https://ethereum.org/en/developers/docs/standards/). This allows for fractional ownership of NFTs and the ability to trade them on decentralized exchanges (DEXs), providing liquidity for traditionally illiquid NFT assets. However, the ERC-404 implementation has faced criticism for creating unpredictable behaviors (https://twitter.com/aus877/status/1761535221701202261) and confusion around the mapping between fungible tokens and associated NFTs. RRC-404 takes a different approach by leveraging the native resource model (https://docs.radixdlt.com/docs) of the Radix ledger to clearly separate fungible and non-fungible states while facilitating conversion between the two.
The key goal of RRC-404 is to simplify the concept of NFT liquidity and provide a more predictable and user-friendly experience (https://twitter.com/aus877/status/1761873683662946654) compared to ERC-404. By using distinct fungible and non-fungible resources, RRC-404 aims to eliminate the potential for accidental loss or burning of NFTs during token transfers, a concern raised with the ERC-404 implementation. RRC-404 Design and Features The design of RRC-404 takes a fundamentally different approach compared to ERC-404 by leveraging the native resource model of the Radix ledger. Unlike Ethereum's ERC tokens, which are defined as smart contracts, Radix uses native resources with predefined behaviors (https://docs.radixdlt.com/main/scrypto/resources.html) that resemble physical money. Distinct Fungible and Non-Fungible Resources At the core of RRC-404 is the separation of fungible and non-fungible assets (https://twitter.com/aus877/status/1761873683662946654) into distinct resource types. Instead of mapping a fungible token to every NFT as in ERC-404, RRC-404 ensures that a user can only hold either a fungible token or a non-fungible token at any given time. This design choice eliminates the potential for accidental loss or burning of NFTs during token transfers, as there is no invisible mapping that could result in unintended consequences. Conversion Between Fungible and Non-Fungible States RRC-404 introduces a dedicated smart contract component (https://github.com/aus87/ice_rrc404v1) that facilitates the conversion between fungible and non-fungible states. Users can interact with this component to "freeze" their fungible tokens into NFTs or "melt" their NFTs back into fungible tokens. This conversion process is designed to be predictable and user-controlled, allowing holders to freely trade or transfer their assets in either state without worrying about unintended consequences. 
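The two-state conversion model described above can be sketched in Rust (the language Scrypto is based on). This is an illustrative sketch, not the actual component: it assumes a 1:1 conversion ratio between fungible tokens and NFTs, omits the time delay and royalty fee, and the names `Supply`, `freeze`, and `melt` are hypothetical.

```rust
// Illustrative sketch of RRC-404's two-state model: an asset exists
// EITHER as fungible tokens or as NFTs, and the only transitions
// between the two states are freeze and melt.

#[derive(Debug)]
struct Supply {
    fungible: u64, // circulating fungible tokens
    nfts: u64,     // circulating NFTs
}

const MAX_SUPPLY: u64 = 1_000; // assumed cap; a real cap is fixed at instantiation

// Freeze: burn fungible tokens from the bucket, mint the same number of NFTs.
fn freeze(s: &mut Supply, amount: u64) -> Result<(), &'static str> {
    if amount > s.fungible {
        return Err("not enough fungible tokens in bucket");
    }
    s.fungible -= amount;
    s.nfts += amount;
    Ok(())
}

// Melt: burn NFTs from the bucket, mint the same number of fungible tokens back.
fn melt(s: &mut Supply, amount: u64) -> Result<(), &'static str> {
    if amount > s.nfts {
        return Err("not enough NFTs in bucket");
    }
    s.nfts -= amount;
    s.fungible += amount;
    Ok(())
}

fn main() {
    let mut s = Supply { fungible: MAX_SUPPLY, nfts: 0 };
    freeze(&mut s, 3).unwrap(); // 3 tokens become 3 NFTs
    melt(&mut s, 1).unwrap();   // 1 NFT melts back into a token
    // The two states are mutually exclusive and their sum is conserved,
    // so no invisible mapping can silently burn an NFT.
    assert_eq!(s.fungible + s.nfts, MAX_SUPPLY);
    println!("{:?}", s);
}
```

Because the two pools only ever trade against each other, total supply is conserved by construction, which is the invariant that makes transfers in either state safe.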
Rarity Percentage Targets Unlike ERC-404, which uses a hash function to determine the rarity of minted NFTs, RRC-404 introduces a novel approach based on rarity percentage targets (https://twitter.com/aus877/status/1761535221701202261) . The smart contract logic aims to maintain a predefined percentage distribution for each rarity level (e.g., 42% green, 30% blue, 16% red, etc.). When minting new NFTs, the contract prioritizes the rarity level that has deviated the most from its target percentage in the current circulating supply. This approach provides a degree of transparency and predictability regarding the distribution of rarities. Time Delays and Royalties To discourage frequent "rerolling" of NFTs, RRC-404 implements a time delay mechanism (https://twitter.com/aus877/status/1761873683662946654) . Once an NFT is minted, there is a predefined period (e.g., 4 hours) during which it cannot be melted back into a fungible token. This introduces a cooldown period and encourages more intentional conversions. Additionally, RRC-404 incorporates a small royalty fee (https://github.com/aus87/ice_rrc404v1) (e.g., 1 $XRD) that is collected whenever a user performs a conversion between fungible and non-fungible states. This royalty mechanism provides a potential revenue stream for the creators or maintainers of the RRC-404 contract. How RRC-404 Works The core functionality of RRC-404 revolves around two main methods: Freeze and Melt. These methods allow users to convert between the fungible and non-fungible states of the token. Freeze: Converting Fungible Tokens to NFTs The Freeze method enables users to convert their fungible tokens into non-fungible tokens (NFTs). To initiate this process, the user must first withdraw the desired amount of fungible tokens from their wallet into a separate bucket. Once the fungible tokens are in the bucket, the user can call the Freeze method on the RRC-404 smart contract component, passing the bucket as an argument. 
The component then performs the following steps: - Checks the total supply of fungible tokens and NFTs to ensure that minting new NFTs will not exceed the maximum supply. - Determines the rarity level of the new NFTs based on the current circulating supply and the predefined rarity percentage targets. - Mints the appropriate number of new NFTs with the corresponding rarity metadata. - Burns the fungible tokens from the provided bucket. - Deposits the newly minted NFTs into the user's wallet. Example code snippet for calling the Freeze method:
CALL_METHOD Address("your_address") "withdraw" Address("fungible_token_address") Decimal("amount");
TAKE_ALL_FROM_WORKTOP Address("fungible_token_address") Bucket("fungible_bucket");
CALL_METHOD Address("rrc404_component_address") "freeze" Bucket("fungible_bucket");
CALL_METHOD Address("your_address") "deposit_batch" Expression("ENTIRE_WORKTOP");
Melt: Converting NFTs to Fungible Tokens The Melt method allows users to convert their NFTs back into fungible tokens. There are two variations of this method: - Melt by Amount: Users can withdraw a specific number of NFTs from their wallet and melt them into fungible tokens. - Melt by IDs: Users can selectively withdraw specific NFTs by their unique IDs and melt those NFTs into fungible tokens. To use the Melt method, the user must first withdraw the desired NFTs from their wallet into a separate bucket. Then, they can call the Melt method on the RRC-404 smart contract component, passing the bucket as an argument. The component will check if the time delay period has elapsed since the NFTs were minted (e.g., 4 hours). If the delay has passed, the component will burn the NFTs from the provided bucket and mint the corresponding amount of fungible tokens, depositing them into the user's wallet.
Example code snippet for calling the Melt method by amount:
CALL_METHOD Address("your_address") "withdraw" Address("nft_token_address") Decimal("amount");
TAKE_ALL_FROM_WORKTOP Address("nft_token_address") Bucket("nft_bucket");
CALL_METHOD Address("rrc404_component_address") "melt" Bucket("nft_bucket");
CALL_METHOD Address("your_address") "deposit_batch" Expression("ENTIRE_WORKTOP");
During both the Freeze and Melt processes, the RRC-404 contract collects a small royalty fee (e.g., 1 $XRD) to incentivize the creators and maintainers of the contract. By separating the fungible and non-fungible states and providing user-controlled conversion methods, RRC-404 aims to simplify the process of managing NFT liquidity while maintaining predictable and transparent behavior. Benefits and Advantages The RRC-404 implementation offers several key benefits and advantages over the ERC-404 standard on Ethereum: Simplicity and Predictability One of the primary goals of RRC-404 is to simplify the concept of providing liquidity for non-fungible tokens (NFTs). By clearly separating the fungible and non-fungible states into distinct resources (https://twitter.com/aus877/status/1761873683662946654), RRC-404 eliminates the potential for unpredictable behaviors and accidental loss or burning of NFTs during token transfers. Users can hold and transact with either fungible tokens or NFTs without worrying about unintended consequences, as there is no invisible mapping or coupling between the two asset types. User Control and Intentionality With RRC-404, users have complete control over the conversion process (https://github.com/aus87/ice_rrc404v1) between fungible and non-fungible states. The dedicated smart contract component provides explicit methods (Freeze and Melt) for users to initiate these conversions intentionally.
This level of user control ensures that changes to asset states are deliberate and well-understood, reducing the risk of accidental or unintended actions that could result in the loss of valuable NFTs. Transparency and Predictability of Rarity Distribution Unlike ERC-404, which uses a hash function to determine the rarity of minted NFTs, RRC-404 introduces a novel approach based on rarity percentage targets (https://twitter.com/aus877/status/1761535221701202261) . By aiming to maintain predefined percentage distributions for each rarity level, RRC-404 provides transparency and predictability regarding the distribution of rarities in the circulating supply. While users cannot control the exact rarity of the NFTs they mint, they can have a general understanding of the likelihood of minting a particular rarity level based on the current supply. Royalty Mechanism for Creators and Maintainers RRC-404 incorporates a small royalty fee (https://github.com/aus87/ice_rrc404v1) that is collected whenever a user performs a conversion between fungible and non-fungible states. This royalty mechanism provides a potential revenue stream for the creators and maintainers of the RRC-404 contract, incentivizing them to continue developing and supporting the standard. Time Delay for Melting NFTs To discourage frequent "rerolling" of NFTs, RRC-404 implements a time delay mechanism (https://twitter.com/aus877/status/1761873683662946654) . Once an NFT is minted, there is a predefined period (e.g., 4 hours) during which it cannot be melted back into a fungible token. This cooldown period encourages more intentional and strategic conversions, rather than rapid speculation or farming. Compatibility with Radix's Native Resource Model By leveraging the native resource model (https://docs.radixdlt.com/main/scrypto/resources.html) of the Radix ledger, RRC-404 seamlessly integrates with the underlying blockchain infrastructure. 
This inherent compatibility ensures that the implementation adheres to the design principles and security guarantees of the Radix ecosystem, potentially simplifying future developments and integrations. Potential Applications and Use Cases The RRC-404 implementation on the Radix blockchain presents several intriguing potential applications and use cases for enabling liquidity for non-fungible tokens (NFTs). Fractional Ownership of NFTs One of the primary motivations behind standards like ERC-404 and RRC-404 is to facilitate fractional ownership of NFTs. By allowing users to hold and trade fungible tokens representing fractional shares of an NFT, these standards open up new possibilities for investment, speculation, and shared ownership models. With RRC-404, users can convert their fungible tokens into whole NFTs or vice versa, enabling them to buy or sell fractions of NFTs as desired. This could potentially increase the accessibility and liquidity of NFT markets, attracting a wider range of investors and collectors. Liquidity for NFT Collections RRC-404 can provide much-needed liquidity for NFT collections (https://twitter.com/aus877/status/1761873683662946654) , which are traditionally illiquid assets. By allowing users to convert their NFTs into fungible tokens, RRC-404 enables these assets to be traded on decentralized exchanges (DEXs) and participate in various decentralized finance (DeFi) protocols. This increased liquidity could attract more interest and investment in NFT projects, as holders would have the ability to easily enter or exit their positions without relying solely on peer-to-peer sales or centralized marketplaces. Real-World Asset Tokenization The concept of fractional ownership and liquidity for NFTs can be extended to the tokenization of real-world assets (https://www.radixdlt.com/post/tokenizing-real-world-assets-on-radix) . 
RRC-404 could potentially be used to represent ownership shares in physical assets like real estate, art, or collectibles, enabling fractional investment and trading opportunities. By tokenizing these assets as NFTs and providing a mechanism for liquidity through fungible tokens, RRC-404 could open up new avenues for democratizing access to traditionally illiquid and exclusive asset classes. Gaming and Virtual Economies The gaming and virtual world industries have been early adopters of NFT technology (https://www.forbes.com/sites/jeffkauflin/2021/04/09/how-nfts-and-digital-ownership-will-revolutionize-gaming/) , using it to represent unique in-game assets, virtual land, and other digital collectibles. RRC-404 could potentially be integrated into these virtual economies, allowing players to convert their NFTs into fungible tokens and participate in various in-game marketplaces or economies. This could create new gameplay mechanics, incentive structures, and trading opportunities within virtual worlds, further blurring the lines between digital and real-world economies. Decentralized Finance (DeFi) Integrations By providing a mechanism for converting NFTs into fungible tokens, RRC-404 could enable various integrations with decentralized finance (DeFi) protocols and applications. NFTs could potentially be used as collateral for lending or borrowing, or as underlying assets for derivatives or synthetic products. This integration could unlock new use cases and financial instruments within the DeFi ecosystem, further expanding the potential applications and utility of NFTs beyond their traditional roles as collectibles or digital art. As the adoption of blockchain technology and NFTs continues to grow, the RRC-404 implementation on the Radix blockchain could play a significant role in bridging the gap between fungible and non-fungible assets, enabling new forms of ownership, investment, and liquidity in various industries and ecosystems. 
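The rarity-targeting mint rule from the design section (when minting, prioritize whichever rarity level has deviated most below its target share of circulating supply) can be sketched as below. The function name, the fourth "other" bucket rounding the example targets to 100%, and the sample counts are illustrative assumptions, not values from the actual contract.

```rust
// Illustrative sketch of rarity selection by largest deficit from target:
// compute each level's current share of circulating NFTs and pick the
// level that is furthest BELOW its target percentage.

fn pick_rarity(targets: &[(&'static str, f64)], counts: &[u64]) -> usize {
    let total: u64 = counts.iter().sum();
    let mut best = 0;
    let mut best_deficit = f64::MIN;
    for (i, (_name, target)) in targets.iter().enumerate() {
        // Share of the circulating supply currently held by this level.
        let share = if total == 0 { 0.0 } else { counts[i] as f64 / total as f64 };
        let deficit = target - share; // positive means under-represented
        if deficit > best_deficit {
            best_deficit = deficit;
            best = i;
        }
    }
    best
}

fn main() {
    // Targets from the article (42% green, 30% blue, 16% red), plus an
    // assumed 12% "other" bucket so the targets sum to 100%.
    let targets = [("green", 0.42), ("blue", 0.30), ("red", 0.16), ("other", 0.12)];
    // Sample circulating counts: green sits far below its 42% target.
    let counts = [10, 40, 30, 20];
    let idx = pick_rarity(&targets, &counts);
    println!("next mint: {}", targets[idx].0); // prints "next mint: green"
}
```

Because the choice depends only on published targets and the observable circulating supply, anyone can predict which rarity the next Freeze is most likely to mint, which is the transparency property claimed above.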
Future Developments and Challenges While the RRC-404 implementation on the Radix blockchain offers a promising approach to enabling liquidity for non-fungible tokens (NFTs), there are several potential areas for future development and challenges that need to be addressed. Scalability and Performance As the adoption of NFTs and tokenized assets grows, the scalability and performance of the underlying blockchain infrastructure will become increasingly important. The Radix ledger, powered by its novel Cerberus consensus protocol, aims to provide high throughput and low transaction costs. However, as demand increases, further optimizations and scaling solutions may be required to ensure efficient and cost-effective operations for RRC-404 and related applications. Interoperability and Cross-Chain Integration While RRC-404 is designed specifically for the Radix ecosystem, there may be a need for interoperability and cross-chain integration with other blockchain networks. This could involve bridging technologies or cross-chain asset transfer protocols, enabling RRC-404 tokens and NFTs to be exchanged or utilized across multiple blockchain ecosystems. Regulatory Compliance and Governance As the adoption of tokenized assets and fractional ownership models grows, regulatory bodies may introduce new guidelines or frameworks to govern these emerging asset classes. RRC-404 and its applications may need to adapt to evolving regulatory requirements, particularly in areas such as investor protection, anti-money laundering (AML), and know-your-customer (KYC) compliance. Additionally, robust governance models may be required to manage the development, maintenance, and decision-making processes surrounding the RRC-404 standard and its implementations. User Experience and Accessibility While RRC-404 aims to simplify the concept of NFT liquidity, there may be opportunities to further improve the user experience and accessibility of the platform. 
This could involve developing intuitive interfaces, educational resources, and user-friendly tools to help individuals and organizations effectively navigate and participate in RRC-404-based ecosystems. Integration with Decentralized Finance (DeFi) Protocols As mentioned in the potential applications section, RRC-404 could enable various integrations with decentralized finance (DeFi) protocols and applications. However, this integration may require the development of new financial instruments, smart contract frameworks, and risk management strategies tailored specifically for tokenized and fractionalized assets. Exploration of Alternative Approaches While RRC-404 presents a novel approach to enabling NFT liquidity, it is essential to continue exploring and evaluating alternative methods or architectures. As the blockchain and cryptocurrency space continues to evolve, new techniques or paradigms may emerge that could further improve upon the concepts introduced by RRC-404. Despite these potential challenges and areas for future development, the RRC-404 implementation on the Radix blockchain represents an exciting step towards unlocking liquidity for non-fungible tokens and bridging the gap between fungible and non-fungible assets. Continued innovation, collaboration, and adaptation will be crucial in realizing the full potential of this emerging technology. ## Root Finance URL: https://radix.wiki/ecosystem/root-finance Updated: 2026-02-06 Root Finance is a decentralized Lending & Borrowing protocol (https://docs.rootfinance.xyz/) operating on Radix. The platform enables users to participate as either depositors or borrowers within an overcollateralized market environment.
As a non-custodial liquidity protocol, Root Finance allows lenders to earn passive income by providing liquidity to the market, while borrowers can obtain funds in an overcollateralized manner. https://youtu.be/ED-e6Wpo01w (https://youtu.be/ED-e6Wpo01w) Key Features - Decentralized Operation: Root Finance operates on the Radix DLT, leveraging blockchain technology to ensure transparency and decentralization. - Non-custodial Design: The protocol is designed to be non-custodial (https://docs.rootfinance.xyz/tecnical-guide/security) , eliminating the need for intermediaries or custodial control of user assets. - Overcollateralization: Borrowers must provide collateral that exceeds the value of their loan, helping to secure the protocol against defaults. - Liquidity Provision: Users can supply various cryptocurrencies to the protocol, earning interest on their deposits. - Borrowing Facility: Users can borrow supported cryptocurrencies by using their deposited assets as collateral. - Root Receipts: Depositors receive Root Receipts (e.g., rtxETH, rtxUSDC) (https://docs.rootfinance.xyz/user-guide/supply-lend) representing their right to redeem supplied assets. - Dynamic Interest Rates: The protocol uses an interest rate model (https://docs.rootfinance.xyz/tecnical-guide/interest-rate-model-of-root-finance) that adjusts based on the utilization rate of the assets in the pool. - Liquidation Mechanism: To maintain the protocol's solvency, positions may be liquidated (https://docs.rootfinance.xyz/liquidations) if they fall below certain collateralization thresholds. - Flash Loans: The protocol offers a flash loan feature, allowing users to borrow without collateral for a single transaction, provided the loan is repaid within the same transaction. Root Finance aims to provide a user-centric platform for the Radix community, enhancing capital strategies through a simple yet comprehensive Lending & Borrowing Decentralized Application (Dapp). 
The protocol has undergone security audits (https://triwei.io/reports/root-finance-report.pdf) to ensure the integrity and safety of the system, with its smart contracts being open-source and publicly available for review. Protocol Overview Root Finance is a decentralized money market protocol that operates on the Radix DLT. The protocol facilitates lending and borrowing of cryptocurrencies within an overcollateralized environment. This section provides an overview of the core functionality, supported assets, and unique features of the Root Finance protocol. Core Functionality The Root Finance protocol centers around two primary functions: - Lending: Users can supply their assets (https://docs.rootfinance.xyz/user-guide/supply-lend) to the Root Finance platform, becoming liquidity providers. The protocol aggregates these supplied assets into pools governed by smart contracts. Lenders earn passive income through interest generated from borrowers using these pools. - Borrowing: Users can borrow supported cryptocurrencies (https://docs.rootfinance.xyz/user-guide/borrow) by leveraging their deposited assets as collateral. The protocol ensures that all loans are overcollateralized, meaning the value of the collateral must exceed the value of the borrowed assets. The protocol utilizes smart contracts to manage these functions, ensuring that all operations are transparent, secure, and automated. Supported Assets Root Finance supports a variety of cryptocurrencies for both lending and borrowing. 
As of the latest documentation, the supported assets include: For Supply (Lending): - $WBTC (Wrapped Bitcoin) - $ETH (Ethereum) - $XRD (Radix) - $USDC (USD Coin) - $USDT (Tether) - $HUG - $LSULP (upcoming) For Borrowing: - $WBTC - $ETH - $XRD - $USDC - $USDT - $LSULP (upcoming) Each asset has specific parameters (https://docs.rootfinance.xyz/tecnical-guide/money-market-parameters) such as maximum Loan to Value (LTV) ratios and Liquidation Thresholds, which govern how they can be used within the protocol. Root Receipts When users supply assets to the Root Finance protocol, they receive Root Receipts in return (https://docs.rootfinance.xyz/user-guide/supply-lend) . These receipts (e.g., $rtxETH for Ethereum, $rtxUSDC for USD Coin) represent the user's right to redeem their supplied assets in the future. Key features of Root Receipts include: - Appreciation: The value of Root Receipts grows over time, reflecting the interest earned on the supplied assets. - Fungibility: Root Receipts are fungible tokens, meaning they can be transferred or potentially used in other DeFi protocols. - Interest Accrual: The appreciation of Root Receipts is determined by the interest rates on deposits, which are in turn influenced by the supply and demand dynamics of the assets within the protocol. Interest Rate Model Root Finance employs a dynamic interest rate model (https://docs.rootfinance.xyz/tecnical-guide/interest-rate-model-of-root-finance) that adjusts based on the utilization rate of assets in the pool. This model aims to balance liquidity risk and maximize utilization: - When capital is plentiful (low utilization), interest rates are set low to encourage borrowing. - When capital is scarce (high utilization), interest rates increase to incentivize repayments and additional deposits. The interest rate curve is segmented into two phases around an optimal utilization rate, with a steeper increase in rates beyond this optimal point to manage liquidity risk. 
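The two-phase curve described above is a standard kinked-rate model, which can be sketched as follows. All numeric parameters here (`base`, `u_optimal`, the two slopes) are illustrative assumptions; Root Finance's actual values are set in its money-market parameters, and the deposit-rate sketch ignores any reserve factor.

```rust
// Sketch of a two-phase (kinked) interest rate curve: a gentle slope up
// to the optimal utilization rate, then a steep slope beyond it to
// discourage draining the pool's liquidity.

fn borrow_rate(utilization: f64) -> f64 {
    let base = 0.01;      // rate at 0% utilization (assumed)
    let u_optimal = 0.80; // optimal utilization rate (assumed)
    let slope1 = 0.04;    // gentle slope below the kink (assumed)
    let slope2 = 0.75;    // steep slope above the kink (assumed)
    if utilization <= u_optimal {
        base + slope1 * (utilization / u_optimal)
    } else {
        base + slope1 + slope2 * ((utilization - u_optimal) / (1.0 - u_optimal))
    }
}

// Depositors collectively earn what borrowers pay, scaled by how much
// of the pool is actually lent out.
fn deposit_rate(utilization: f64) -> f64 {
    borrow_rate(utilization) * utilization
}

fn main() {
    for u in [0.0, 0.5, 0.8, 0.95] {
        println!(
            "U = {:>4.0}%  borrow = {:.2}%  deposit = {:.2}%",
            u * 100.0,
            borrow_rate(u) * 100.0,
            deposit_rate(u) * 100.0
        );
    }
}
```

With these sample parameters the borrow rate drifts from 1% to 5% as utilization approaches the kink, then climbs sharply toward 80% as the pool nears exhaustion, which is exactly the incentive structure the model describes: cheap credit when capital is plentiful, strong pressure to repay when it is scarce.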
Liquidation Mechanism To maintain the protocol's solvency and protect lenders, Root Finance implements a liquidation mechanism (https://docs.rootfinance.xyz/liquidations) . Liquidations occur when an account's collateral value decreases or the borrowed amount increases relative to the collateral, causing the loan-to-value (LTV) ratio to meet or exceed the liquidation threshold. During a liquidation: - Up to 40% of the loan value can be liquidated. - Liquidators receive an incentive, which includes an 8% liquidation fee. - The liquidation process aims to restore the loan to a healthy level of collateralization. This mechanism ensures that the protocol remains solvent and can continue to function even in volatile market conditions. Key Features Root Finance offers several key features that distinguish it in the decentralized finance (DeFi) ecosystem. These features are designed to provide a secure, efficient, and user-friendly platform for lending and borrowing cryptocurrencies. Non-custodial Design Root Finance is designed as a non-custodial platform (https://docs.rootfinance.xyz/tecnical-guide/security) , which means that users retain control of their assets at all times. Key aspects of this non-custodial design include: - Users' assets are secured within immutable smart contracts. - There's no need for intermediaries or custodial control. - The smart contracts are open-source and publicly available for review, adhering to the principle of "don't trust, verify." This design enhances security and aligns with the decentralized ethos of blockchain technology. Decentralized Governance While the documentation doesn't explicitly mention a governance system, the protocol operates on the Radix DLT (https://docs.rootfinance.xyz/) , which suggests a degree of decentralization in its operation. The specific governance mechanisms, if any, are not detailed in the provided documents. 
Interest Rate Model Root Finance implements a sophisticated interest rate model (https://docs.rootfinance.xyz/tecnical-guide/interest-rate-model-of-root-finance) designed to balance liquidity risk and maximize utilization. Key features of this model include: - Dynamic adjustment based on the Utilization Rate (U) of the asset pool. - A two-phase interest rate curve centered around an optimal utilization rate (Uoptimal). - Gradual rate increase up to Uoptimal, followed by a significant increase beyond this point. - Separate formulas for calculating borrow rates and deposit rates. This model aims to incentivize optimal capital allocation within the protocol. Liquidation Mechanism To maintain the protocol's solvency, Root Finance employs a liquidation mechanism (https://docs.rootfinance.xyz/liquidations) . Key aspects of this mechanism include: - Liquidations are triggered when an account's loan-to-value (LTV) ratio meets or exceeds the liquidation threshold. - Up to 40% of the loan value can be liquidated in a single event. - Liquidators receive an incentive, including an 8% liquidation fee. - The process aims to restore loans to healthy collateralization levels. This mechanism helps protect lenders and maintain the overall health of the protocol. Flash Loans Root Finance offers a flash loan feature, as evidenced by the security audit report (https://triwei.io/reports/root-finance-report.pdf) . Flash loans allow users to borrow assets without collateral, provided the loan is repaid within the same transaction. Key points about flash loans: - They enable complex DeFi operations and arbitrage opportunities. - The audit identified a critical vulnerability in the flash loan mechanism, which was subsequently fixed. - The implementation details and current status of flash loans are not fully described in the provided documentation. 
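The atomicity requirement for flash loans — borrow and repay inside one transaction — is commonly enforced with a transient "receipt" resource that only the repayment path can settle, which is why the audited flaw allowing unrestricted burning of TransientResData was critical. Below is a minimal sketch of that pattern; the class names and fee are illustrative, not Root Finance's actual Scrypto types.

```python
# Sketch of the transient-receipt pattern behind flash loans: the loan
# hands back a receipt that only the repayment path may settle, so a
# transaction cannot complete while the receipt is outstanding.
# Names and the fee rate are illustrative assumptions.

class FlashLoanError(Exception):
    pass

class TransientReceipt:
    """Stands in for a transient resource: it must be settled by the
    repayment path before the transaction may complete."""
    def __init__(self, amount_due: float):
        self.amount_due = amount_due
        self.settled = False

class FlashLoanPool:
    def __init__(self, liquidity: float, fee_rate: float = 0.001):
        self.liquidity = liquidity
        self.fee_rate = fee_rate  # assumed fee, for illustration only

    def take_loan(self, amount: float) -> tuple[float, TransientReceipt]:
        if amount > self.liquidity:
            raise FlashLoanError("insufficient liquidity")
        self.liquidity -= amount
        return amount, TransientReceipt(amount * (1 + self.fee_rate))

    def repay_loan(self, payment: float, receipt: TransientReceipt) -> None:
        if payment < receipt.amount_due:
            raise FlashLoanError("repayment below amount due")
        self.liquidity += payment
        receipt.settled = True  # only the repayment path settles it

def run_transaction(pool: FlashLoanPool, callback) -> None:
    """Borrow, hand control to the caller, then enforce settlement.
    On a real ledger an unsettled receipt aborts (rolls back) the
    whole transaction; here we simply raise to signal the abort."""
    amount, receipt = pool.take_loan(100.0)
    callback(pool, amount, receipt)
    if not receipt.settled:
        raise FlashLoanError("flash loan not repaid within the transaction")
```

The audit finding amounted to a way of destroying the receipt outside the repayment path, which in this sketch would be equivalent to setting `settled` without paying.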
Security Measures Root Finance places a strong emphasis on security, as evidenced by several measures: - The protocol has undergone rigorous audits (https://docs.rootfinance.xyz/tecnical-guide/security) by reputable third-party firms. - Smart contracts are open-source and publicly available (https://docs.rootfinance.xyz/tecnical-guide/security) for community review. - The team adheres to best practices in blockchain industry development. Root Points System Root Finance implements a Root Points system (https://docs.rootfinance.xyz/root-points/rules) to incentivize user engagement. Key features of this system include: - Users earn points for interacting with the platform, such as supplying or borrowing assets. - Additional points can be earned through specific quests, like following Root Finance on social media. - A referral program allows users to earn bonus points. - Points are calculated based on the value of assets supplied or borrowed, with different rates for different assets. - A leaderboard displays top wallets based on their Root Points. This system adds a gamification element to the protocol, potentially increasing user engagement and retention. Technical Details Root Finance is built on a complex technical infrastructure designed to ensure security, efficiency, and functionality. This section delves into the technical aspects of the protocol, including its smart contracts, security measures, and integration with the Radix DLT. Smart Contracts Root Finance's core functionality is implemented through a series of smart contracts. The main components, as identified in the audit report (https://triwei.io/reports/root-finance-report.pdf) , include: - Lending Market: The central contract managing lending and borrowing operations. - Single Resource Pool: Handles individual asset pools. - Internal Price Feed: Provides price data for assets within the system. 
Key smart contract files and their functions include: - lending_market.rs: Implements the main lending market functionalities. - resources.rs: Defines the resources used in the system. - modules/cdp_data.rs: Manages Collateralized Debt Position (CDP) data. - modules/cdp_health_checker.rs: Checks the health of CDPs. - modules/interest_strategy.rs: Implements the interest rate strategy. - modules/liquidation_threshold.rs: Defines liquidation thresholds. - modules/pool_config.rs: Configures individual asset pools. - modules/pool_state.rs: Manages the state of asset pools. These contracts are written in Scrypto, a programming language designed for the Radix DLT. Security Measures Root Finance implements several security measures to protect user funds and ensure the integrity of the system: - Non-custodial Design: User assets are secured (https://docs.rootfinance.xyz/tecnical-guide/security) within immutable smart contracts, eliminating the need for trust in intermediaries. - Open Source Code: The smart contracts are entirely open-source (https://docs.rootfinance.xyz/tecnical-guide/security) and publicly available, allowing for community review and verification. - Third-party Audits: The protocol has undergone rigorous audits by reputable third-party firms (https://docs.rootfinance.xyz/tecnical-guide/security) to ensure the integrity and safety of the system. - Access Control: The system implements access control mechanisms (https://triwei.io/reports/root-finance-report.pdf) to restrict certain functions to authorized users only. - Decimal Precision Handling: The system uses the Decimal type for financial calculations, though the audit report (https://triwei.io/reports/root-finance-report.pdf) notes that there's room for improvement in standardizing decimal precision handling. 
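The decimal-precision concern flagged by the audit can be made concrete: binary floating point accumulates representation error, while a Decimal type with an explicit rounding policy stays exact for base-10 amounts. A small illustration follows; the 18-place rounding policy is an assumption, not Root Finance's documented setting.

```python
# Why standardized decimal precision matters for financial code:
# binary floats accumulate representation error, while decimal
# arithmetic with an explicit rounding policy stays exact.
from decimal import Decimal, ROUND_DOWN

# Float drift: summing 0.1 ten times does not give exactly 1.0.
float_total = sum([0.1] * 10)

# Decimal stays exact when amounts are constructed from strings.
dec_total = sum([Decimal("0.1")] * 10)

def accrue(principal: Decimal, rate: Decimal) -> Decimal:
    """Apply an interest rate, rounding down to 18 decimal places
    (the 18-place policy here is illustrative, not Root Finance's)."""
    return (principal * (1 + rate)).quantize(
        Decimal("1." + "0" * 18), rounding=ROUND_DOWN)
```

Running this, `float_total` is 0.9999999999999999 while `dec_total` is exactly `Decimal('1.0')` — exactly the kind of small discrepancy the audit warned could accumulate without a standardized precision policy.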
Audits and Vulnerabilities An independent security audit (https://triwei.io/reports/root-finance-report.pdf) of Root Finance identified several issues: - Critical: A vulnerability in the flashloan mechanism allowing unrestricted burning of TransientResData resources, which could be exploited to avoid repaying flashloans. This issue was fixed. - Medium: - Lack of a liquidator badge revocation mechanism. - Centralized price feed, introducing a single point of failure risk. - Low: - Potential for creation of empty Collateralized Debt Positions (CDPs). - Inconsistent handling of decimal precision in financial calculations. These issues were addressed or mitigated following the audit. User Interaction Root Finance provides a platform for users to engage in lending and borrowing activities. This section details the primary ways users can interact with the protocol, including supplying assets, borrowing, repaying loans, and the liquidation process. Supplying Assets Users can supply their assets to Root Finance to become liquidity providers. The process works as follows: - Asset Selection: Users can supply various cryptocurrencies (https://docs.rootfinance.xyz/user-guide/supply-lend) , including $WBTC, $ETH, $XRD, $USDC, $USDT, and $HUG. - Deposit: Users deposit their chosen assets into the Root Finance protocol. - Root Receipts: In return for their supplied assets, users receive Root Receipts (https://docs.rootfinance.xyz/user-guide/supply-lend) (e.g., $rtxETH, $rtxUSDC). These receipts represent the user's right to redeem their supplied assets in the future. - Interest Accrual: The value of Root Receipts grows over time, reflecting the interest rates on deposits determined by the supply and demand dynamics of the assets. - Withdrawal: Users can withdraw their supplied assets at any time by redeeming their Root Receipts, subject to the available liquidity in the protocol. Borrowing Root Finance allows users to borrow assets by using their supplied assets as collateral. 
The borrowing process involves: - Collateral: Users must first supply assets to use as collateral (https://docs.rootfinance.xyz/user-guide/borrow) . - Loan-to-Value (LTV) Ratio: Each asset has a specific LTV ratio, which determines the maximum amount that can be borrowed against it. The overall borrowing capacity (https://docs.rootfinance.xyz/user-guide/borrow) is calculated based on the average of the LTV ratios of the supplied assets, weighted by their respective values. - Borrowing: Users can borrow any of the supported cryptocurrencies (https://docs.rootfinance.xyz/user-guide/borrow) up to their borrowing limit. Available assets for borrowing include $WBTC, $ETH, $XRD, $USDC, and $USDT. - Interest Accrual: The borrowed amount accrues interest over time, based on the protocol's dynamic interest rate model. - Risk Management: Users need to monitor the market value of their collateral and borrowed amount to maintain a healthy financial position and avoid liquidation. Repaying Loans Users can repay their borrowed funds at any time using the "Repay" feature. The repayment process includes: - Flexible Repayment: Users can choose to make partial or full repayments (https://docs.rootfinance.xyz/user-guide/repay-your-borrow-on-root) of their loans. - LTV Reduction: Repaying loans lowers the user's LTV ratio (https://docs.rootfinance.xyz/user-guide/repay-your-borrow-on-root) , which is crucial for avoiding liquidation. - Collateral Release: As users repay their loans and their LTV decreases, they can withdraw a portion of their collateral (https://docs.rootfinance.xyz/user-guide/repay-your-borrow-on-root) . Full repayment allows for complete collateral withdrawal. - Interest Savings: Prompt repayment helps users minimize interest accumulation, allowing for better debt management. Liquidations Liquidations are a crucial part of the Root Finance protocol, ensuring its overall solvency. 
The liquidation process works as follows: - Triggering Conditions: Liquidations occur when an account's collateral value decreases or the borrowed amount increases (https://docs.rootfinance.xyz/liquidations) relative to the collateral, causing the LTV ratio to meet or exceed the liquidation threshold. - Liquidation Threshold: This is set at 5% above the Maximum LTV (https://docs.rootfinance.xyz/liquidations) for each asset. For example, if an asset's Max LTV is 70%, its Liquidation Threshold would be 75%. - Liquidation Process: When a position is flagged for liquidation, the protocol allows liquidation of up to 40% of the Loan Value (https://docs.rootfinance.xyz/liquidations) . - Liquidator Incentive: Liquidators receive an incentive for performing the liquidation. This includes a liquidation fee of 8% of the liquidation value (https://docs.rootfinance.xyz/liquidations) . - User Impact: For the user being liquidated, the liquidation results in a reduction of their debt, but also a loss of a portion of their collateral. - Health Bar: Root Finance provides a Health Bar feature (https://docs.rootfinance.xyz/liquidations) to help users monitor their position's health and proximity to the liquidation threshold. Users are encouraged to actively manage their positions, monitor market conditions, and maintain healthy LTV ratios to avoid liquidation events. Root Points System The Root Points system is an incentive mechanism designed by Root Finance to reward users for interacting with the platform. This system encourages user engagement and loyalty by offering points for various activities within the protocol. Earning Mechanisms Users can earn Root Points (https://docs.rootfinance.xyz/root-points/rules) through several methods: - Asset Supply and Borrowing: For every $100 worth of assets supplied or borrowed, users automatically accumulate Root Points every 24 hours. - Quests: Users can complete specific tasks to earn additional points. 
- Referrals: Users can earn bonus points by referring new users to the platform. The accumulation of points is based on snapshots of user positions taken at random times throughout each 24-hour period. Quests and Bonuses Root Finance offers several quests and bonuses (https://docs.rootfinance.xyz/root-points/rules) for users to earn additional Root Points: - Social Media Engagement: - Follow Root Finance on X (formerly Twitter): +10 Root Points (one-time bonus) - Join the Root Finance Telegram: +10 Root Points daily - Asset Deposit: - Deposit at least $200 USD worth of assets: Earn Root Points daily based on the value deposited - Asset Borrowing: - Borrow at least $100 USD worth of assets: Accumulate Root Points daily based on the value borrowed - Referral Program: - Invite a friend to join Root Finance: Earn an additional 10% of the Root Points they accumulate every 24 hours Point Calculation The number of Root Points earned depends on the type and amount of assets supplied or borrowed: - Supply Bonus: - For every $100 worth of $XRD, $WBTC, $ETH and $LSU (https://docs.rootfinance.xyz/root-points/rules) : 1 Root Point every 24 hours - For every $100 worth of $USDC and $USDT (https://docs.rootfinance.xyz/root-points/rules) : 2 Root Points every 24 hours - For every $100 worth of $HUG (https://docs.rootfinance.xyz/root-points/rules) : 3 Root Points every 24 hours - Borrow Bonus: - For every $100 worth of $USDC, $USDT, $XRD, $WBTC, $ETH and $LSU (https://docs.rootfinance.xyz/root-points/rules) : 3 Root Points every 24 hours Verification Process To ensure fair distribution of points, Root Finance employs a verification process (https://docs.rootfinance.xyz/root-points/rules) : - Snapshots of users' positions are taken at random intervals throughout the 24-hour period. - Assets must remain in place during the day to ensure eligibility for the points earned. - If a user's balance changes or falls below the required threshold, their points for that day may be affected. 
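The per-$100 point rates above can be sketched as a daily calculator (asset names and rates are taken directly from the rules page; amounts are in USD, and quest and referral bonuses are excluded):

```python
# Daily Root Points calculator using the per-$100 rates listed above.
# Supply: 1 point for $XRD/$WBTC/$ETH/$LSU, 2 for $USDC/$USDT, 3 for
# $HUG; borrow: 3 points for any listed asset. Amounts are in USD.

SUPPLY_RATES = {"XRD": 1, "WBTC": 1, "ETH": 1, "LSU": 1,
                "USDC": 2, "USDT": 2, "HUG": 3}
BORROW_RATE = 3  # points per $100 borrowed, any listed asset

def daily_points(supplied: dict[str, float],
                 borrowed: dict[str, float]) -> float:
    """Root Points accrued over one 24-hour period."""
    points = sum(value / 100 * SUPPLY_RATES[asset]
                 for asset, value in supplied.items())
    points += sum(value / 100 * BORROW_RATE
                  for value in borrowed.values())
    return points
```

For example, supplying $500 of $XRD and $200 of $USDC while borrowing $100 of $USDT earns 5 + 4 + 3 = 12 points per day, before any quest or referral bonuses.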
Leaderboard Root Finance maintains a leaderboard (https://docs.rootfinance.xyz/root-points/rules) that displays the top wallets based on their Root Points. This feature adds a competitive element to the points system, potentially driving further engagement with the platform. Future Utility While the current documentation doesn't specify the future utility of Root Points, such point systems in DeFi protocols are often used for: - Governance participation - Access to exclusive features or products - Discounts on protocol fees - Potential token airdrops However, it's important to note that these are speculative uses and not confirmed features of the Root Points system. Risks and Challenges As with any decentralized finance (DeFi) protocol, Root Finance faces various risks and challenges. Understanding these is crucial for users, developers, and stakeholders involved with the platform. Smart Contract Risks Smart contract vulnerabilities pose significant risks to DeFi protocols. For Root Finance, several issues were identified in a security audit (https://triwei.io/reports/root-finance-report.pdf) : - Critical Vulnerability: A flaw in the flashloan mechanism was discovered that could allow unrestricted burning of TransientResData resources, potentially enabling users to avoid repaying flashloans. This issue was reported as fixed after the audit. - Access Control: The audit revealed a lack of a liquidator badge revocation mechanism (https://triwei.io/reports/root-finance-report.pdf) . This could potentially lead to security issues if a liquidator's rights need to be revoked. - Empty CDP Creation: The potential for creating empty Collateralized Debt Positions (CDPs) (https://triwei.io/reports/root-finance-report.pdf) was identified as a low-risk issue. - Decimal Precision: Inconsistent handling of decimal precision (https://triwei.io/reports/root-finance-report.pdf) in financial calculations was noted, which could lead to small discrepancies that may accumulate over time. 
While these issues were addressed or mitigated following the audit, the existence of such vulnerabilities highlights the ongoing challenge of maintaining secure smart contracts. Market Risks Market risks are inherent to any financial system, particularly in the volatile cryptocurrency market: - Price Volatility: Rapid price fluctuations in cryptocurrencies can lead to sudden changes in collateral values, potentially triggering liquidations (https://docs.rootfinance.xyz/liquidations) . - Liquidity Risk: In periods of high market stress, there might be insufficient liquidity for users to withdraw their assets or for liquidations to occur efficiently. - Interest Rate Fluctuations: The dynamic interest rate model (https://docs.rootfinance.xyz/tecnical-guide/interest-rate-model-of-root-finance) used by Root Finance, while designed to balance supply and demand, could lead to rapid changes in borrowing costs or lending yields. Operational Risks Several operational risks were identified or can be inferred from the provided documentation: - Centralized Price Feed: The audit report (https://triwei.io/reports/root-finance-report.pdf) noted a centralized price feed as a medium-severity issue. This introduces a single point of failure risk, where inaccurate price data could potentially disrupt the entire system. - Radix DLT Dependence: As Root Finance is built on the Radix DLT (https://docs.rootfinance.xyz/) , it is exposed to any potential issues or limitations of this underlying blockchain platform. - Liquidation Mechanism: While designed to maintain system solvency, the liquidation process (https://docs.rootfinance.xyz/liquidations) could potentially lead to significant losses for users if not managed properly or if market conditions are extremely volatile. 
Regulatory Challenges While not explicitly mentioned in the provided documentation, DeFi protocols like Root Finance often face regulatory challenges: - Regulatory Uncertainty: The rapidly evolving nature of DeFi means that new regulations could potentially impact the operation of protocols like Root Finance. - Cross-border Transactions: As a decentralized platform, Root Finance may facilitate transactions across different jurisdictions, potentially raising complex legal and regulatory questions. - KYC/AML Compliance: The non-custodial and decentralized nature of the protocol may present challenges in complying with Know Your Customer (KYC) and Anti-Money Laundering (AML) regulations, which are increasingly being applied to cryptocurrency platforms. It's important to note that the specific regulatory challenges faced by Root Finance are not detailed in the provided documentation, and the actual impact of regulations may vary depending on the jurisdictions involved and the evolving regulatory landscape. ## Religant URL: https://radix.wiki/ecosystem/religant Updated: 2026-02-06 Summary: Religant is a decentralized oracle service designed to provide accurate and up-to-date price data feeds on the Radix network. Religant is a decentralized oracle service designed to provide accurate and up-to-date price data feeds on the Radix network. Functionality The Religant Oracle protocol operates by aggregating data from various sources. A network of data nodes queries exchanges for current transaction data and submits this information to a contract. This contract employs a consensus mechanism to determine the aggregate price. Data nodes, running the Religant node backend software, update the feed at regular intervals or when the new aggregate price exceeds a specified divergence threshold, currently set at 2%. This threshold ensures that the rate from the Religant oracle's feed remains within 2% of actual transaction rates on exchanges. 
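The update rule described above — publish on a regular interval, or early whenever the new aggregate diverges from the last published price by more than 2% — can be sketched as follows. Only the 2% threshold comes from the Religant docs; the heartbeat interval and function names are assumptions for illustration.

```python
# Sketch of a data node's decision to push a new aggregate price:
# either the regular heartbeat interval has elapsed, or the price has
# diverged from the last published value beyond the 2% threshold.
# The interval and names are assumed; only the 2% figure is documented.

DIVERGENCE_THRESHOLD = 0.02  # 2%, per the Religant docs

def should_update(last_published: float, new_aggregate: float,
                  seconds_since_update: float,
                  interval: float = 300.0) -> bool:
    """Return True if the feed should be refreshed now."""
    if seconds_since_update >= interval:
        return True  # regular heartbeat update
    divergence = abs(new_aggregate - last_published) / last_published
    return divergence > DIVERGENCE_THRESHOLD
```

This early-update path is what keeps the published rate within 2% of actual exchange rates even between heartbeats.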
Component API The user-facing API of the Religant Oracle is straightforward, consisting of a single method, get_price, which returns a PriceData structure containing the current XRD/USD exchange rate and a timestamp. The Option wrapper in the return type ensures type safety, with the method usually returning a valid result after the initial round of Oracle feed aggregation. Integration Guide Integrating the Religant Oracle into an application requires several steps: - Defining the PriceData structure outside of the application blueprint, using various traits and types like ScryptoSbor, USDValue, and POSIXTime. - Declaring the component's API within the blueprint module using an extern_blueprint! macro. - Instantiating a reference to the global Religant component within the blueprint. - Using the Religant component's methods within the application, such as retrieving the current exchange rate and minting tokens based on this rate. ## Reizor URL: https://radix.wiki/ecosystem/reizor Updated: 2026-02-06 Summary: Reizor is an innovative platform designed to revolutionize the way fans and artists connect and interact. Reizor is an innovative platform designed to revolutionize the way fans and artists connect and interact. The platform offers a range of features that enable fans to not only be spectators but to actively participate in and influence live performances, and even connect with artists on a personal level. Additionally, the platform provides a social environment for fans to connect with each other and stay engaged with their favorite artists continuously. Features - Reizor Tokens (https://reizor.com/reizorexp) : By becoming an active member of the global music community with Reizor, users can unlock exclusive perks and experiences with just a few clicks and a small number of Reizor tokens. 
- Hunting (https://reizor.com/hunting) : Reizor’s AR technology allows users to ‘hunt’ for tokens anytime, anywhere. Users receive notifications when they are near a token, and they can use their camera to hunt for it. With geolocation, users can search for tokens worldwide. The platform also offers a unique way for fans to connect and earn more tokens by meeting up with like-minded fans outside of venues. The more a user hunts, the more rewards they earn, deepening their loyalty and engagement with the platform. - Reizor Events (https://reizor.com/reizorexp) : Location-based events where users experience the thrill of the hunt and are rewarded with tokens for taking part. - Reizor Rewards (https://reizor.com/reward) : Allows users to redeem a small number of Reizor tokens for exclusive perks and experiences within the global music community. - Reizor Badge System (https://reizor.com/reizorexp) : Tokens earned through hunting unlock progressively higher badge levels or can be exchanged for concert tickets. - Reizor Marketplace (https://reizor.com/marketplace) : The marketplace where users exchange their Reizor tokens for exclusive perks and experiences. Team The team behind Reizor comprises individuals with diverse expertise: - Shai S.: The founder of Reizor, Shai S. (https://reizor.com/team) , is a serial entrepreneur in the real estate and tech field with over eighteen years of experience. 
- Erez Eisen: A world-renowned DJ (Infected Mushroom), Erez (https://reizor.com/team) began classical music training at a young age. - Alon Mantsur - Cybrella: With over 20 years of experience in Cyber Security, Alon (https://reizor.com/team) brings significant business value to companies by providing cybersecurity solutions. - Sam Bamuel: A VR/AR Specialist and Creative Producer of video and XR experiences based in New York City, Sam (https://reizor.com/team) has over 15 years of experience helping various organizations and artists tell stories with new and traditional media. Advisory Board The advisory board of Reizor (https://reizor.com/team) includes Avi Sasson (Head of Online Marketing), Avi Weber, Esq. (Legal Compliance Counsel), Cebric Lebeau (Dev and Technical Advisor), and many more. Benefits Reizor is revolutionizing the way artists and fans interact by creating a genuine connection between them. The platform offers unparalleled engagement opportunities for fans with their favorite artists. Moreover, Reizor (https://reizor.com/home) empowers artists to recognize and connect with their most dedicated fans, transforming passive fans into active ones through recognition and engagement. The platform also allows users to experience concerts, parties, and festivals in a stunning 4k 360-degree view, all from the comfort of their homes. Reizor Experience Reizor (https://reizor.com/reizorexp) offers a unique experience called "ReizorExp" that is designed to increase user engagement, reward users, grow the community, and boost online reputation. The platform turns user actions into valuable tokens, and the more tokens a user accumulates, the higher their "Reizor Badge Level" becomes. This unlocks access to special discounts, exclusive products, and parties. 
## RDX Works URL: https://radix.wiki/ecosystem/rdx-works Updated: 2026-02-06 Summary: 👋 Open Positions: Blockchain Integration Specialist, Web3 Integration Engineer, Web3 Solutions Engineer. RDX Works is a product studio (https://www.rdx.works) best known as core contributors to the Radix network. The company was founded by Dan Hughes (https://www.notion.so/Dan-Hughes-ce73c8c28e8446b5b4e8d997f8be3d98?pvs=21) in 2017 (https://find-and-update.company-information.service.gov.uk/company/10864928/officers) as Radix DLT (https://find-and-update.company-information.service.gov.uk/company/10864928) but changed its name on 10 November 2021. History RDX Works was originally founded as Radix DLT by Dan Hughes (https://www.notion.so/Dan-Hughes-ce73c8c28e8446b5b4e8d997f8be3d98?pvs=21) in 2017. The company changed its name to RDX Works in November 2021. In October 2020, RDX Works raised £9.82m (https://www.uktechnews.info/2020/10/29/radix-dlt-secures-9-82-million-series-a-follow-on-investment/) from its token sale of $eXRD. 
Investor Saul Klein was removed as a director (https://s3.eu-west-2.amazonaws.com/document-api-images-live.ch.gov.uk/docs/w3_6OCsgR_YDqM-l8W6JADIqfydZsRK6qsX3cqD1VIw/application-pdf?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=ASIAWRGBDBV3CMQOBKVA%2F20230705%2Feu-west-2%2Fs3%2Faws4_request&X-Amz-Date=20230705T112105Z&X-Amz-Expires=60&X-Amz-Security-Token=IQoJb3JpZ2luX2VjELH%2F%2F%2F%2F%2F%2F%2F%2F%2F%2FwEaCWV1LXdlc3QtMiJGMEQCIH3YtYxOD%2Fzx6%2F3iJ0nRMMYR9NusYi2eYRmvMh8cBKJtAiBCzsqoPZbmLtK4Z7jUsUSUEuomA4Yz%2Bo7ZIpWsb%2Bi1jCq7BQgqEAQaDDQ0OTIyOTAzMjgyMiIMSjIyfiZ1MyCwPCYVKpgFTLRYD2aqLa9L0q%2BeSYF0XTXgbON1y6l9v1hFuZAJcuQ7fargJCGEfeN%2FauaUkS2kS6LA6QStUlUQd9iXy3eQ9Tar%2FIA4z9fxPz57DIghzOTyPEoUQJflV5V4Dl0xUzi2ZXmOkAfvZXsRaievfKTIE3vPMbecc3ZZYuhAjYKCoyfSwTX%2BIu6IUzOlNOKO26ZQuuBpKz%2FFA%2B2qwt2OsXs85kb33OKCTv45hmLjb1rTJYcMYWBPPe2rdi7jkVvBUCCjDAyT7tcy2vHM8bRiKwdcUpS2eLcF8F5xqc0mKEJfPBw%2F2AUD5DIlWzGi2EyoADC90ucKRZgPQ8RyMfPRIpp5jz4%2B7Vgt%2BgTjlt773NbNecwqAO9hVf8Fm4fHfBgi9BABp8VbNxGWgeUFt45F4jTGknzEyprkNhm9Dp20w5Em1tpx7moMybIItjKMMIswwe7leSfqhbrsbcJzVJT1Hs0Ieac%2F4Yroj4XopodheKlu%2F7qml3VQ5aaZLYSCf3ZiSTytHduht1Qpdm2CFsSwh34181wD0PNPrjan1AkkE7wBuo4%2FH7Evrtcg9k8HWf5Y9H9N5qALyxTaS6SIMVSO%2FH6qX%2Fa4weuVgGK3DGPASYGT9%2FrhpJh3OZ4KvyRYPixz9zBiDW33HeLbiOSUsSF0q62p6EUnnUVySy4QEgMGG5vmH5CAtemdgt%2Fg0M4s3Ij3roLZ7Vc795D7w1BmidxIybFunvdm42QEWyr8zZLlVC5g9EI2EiBOkNtn1bV%2BYNSDxkA3r5t9QcED1LsI04cfgsaoQuSckgCm8p5bQGdas9yHHzDfKyZCpuOsJWlaTUWm%2Fh0w8by%2BFoWhgQoXG6PNF8Hnc4%2Fd29AJ%2BStTbfZf2wpLxa9dk3CEgLavfjDb5pSlBjqyAU%2FK5e5VqjFvC8zu%2FIZ%2BVDK4i7a6KhvgJtJcf8WSl5R0ehVXe0%2BPOYtLttHm%2BO6HBj25b1QtCN29XItUlQ2KX6gDt2yW1SUw7bREZ3LawlK%2B%2FxQzlh9di7SAEq4%2BYPvWTfqt1fIJ%2BCFMdJnJsysofFZPnk%2BhK7H4xw3GYSfTKrORk3dsJ8IWMun00XfWB3gxt%2Ftm2gGFIBsSt4qnPDDsY1WECRZH8v4QpRjCo%2Fu8GnfM1EE%3D&X-Amz-SignedHeaders=host&response-content-disposition=inline%3Bfilename%3D%2210864928_tm01_2023-06-06.pdf%22&X-Amz-Signature=7259a38efb3c22fc618b62f428bff1943caa0e32dbbadd519e837596cfa8aeb7) on the 5th June 2023. 
Products RDX Works has a mission to provide builders with the tools they need to replace traditional financial systems with a highly interconnected network of user-first DeFi apps and assets. These tools are built on top of the Radix public ledger, a record of all transactions that have ever occurred on the Radix Public Network. Radix DLT RDX Works describes Radix DLT as the only decentralized network where developers can build quickly without the constant threat of exploits and hacks. It is designed to reward every improvement and aims to never be a bottleneck in terms of scale. Instapass (https://www.instapass.io) Instapass is designed to make compliance with Know Your Customer (KYC) and Anti-Money Laundering (AML) regulations simple for DeFi users, protocols, and applications (https://www.rdx.works/) . Instabridge (https://www.instabridge.io) This tool is intended to simplify the process of moving crypto assets between different financial ecosystems. GoodFi (https://www.linkedin.com/company/goodfi/) GoodFi is a not-for-profit initiative that aims to make entering the world of DeFi as easy as possible. Its members include Aave, Sushiswap, Avalanche, and others. Controversies 2019/12/13 - Mr R Olsen vs Radix DLT Limited ( 2202365 (https://assets.publishing.service.gov.uk/media/5e15f2d0e5274a06b05f2597/Mr_R_Olsen__vs_Radix_DLT_Limited_-_Judgment.pdf) ) Mr Robert Olsen (https://www.attestant.io/team/) presented a claim of unfair dismissal against Radix DLT Limited. Following an early conciliation period, the claimant initially had legal representation. However, by 31 October 2019, the firm of solicitors that represented the claimant ceased to act for him. Subsequently, the respondent's response was sent directly to the claimant's email. Despite the respondent's numerous attempts to engage with the claimant, they faced challenges with communication, as the claimant was largely unresponsive. 
On 29 November 2019, Mr Olsen sent a communication to the tribunal stating that his legal representation had abandoned him and requested an extension. There was no response from the tribunal to this request. Ms G Leadbetter, representing the respondent, made an application during the hearing on 13 December 2019 to dismiss the claimant's case under rule 47 of the tribunal rules, which states that a case may be dismissed if a party fails to attend or represent themselves. The respondent posited that the claimant might be in Dubai, attending a conference, based on information from a mutual contact. Further, while the claimant did not communicate with the respondent's solicitors, he did maintain sporadic social media contact with the respondent's director. Employment Judge E Burns confirmed no other correspondence from the claimant had been overlooked and subsequently chose to exercise discretion under rule 47. The judge concluded that there was a pattern of non-engagement by the claimant, asserting that the claimant's absence was voluntary and indicative of his lack of interest in pursuing his claim. Consequently, the claim was dismissed. 2020/08/26 - Radixx Solutions International, Inc. vs Radix DLT Limited ( 79240843 (https://tsdr.uspto.gov/#caseNumber=79240843&caseType=SERIAL_NO&searchType=statusSearch) ) 2021/01/25 - Radixx Solutions International, Inc. vs Radix DLT Limited ( 91267279 (https://ttabvue.uspto.gov/ttabvue/v?pno=91267279&pty=OPP) ) 2021/08/11 - Radixx Solutions International, Inc. & Radix DLT Limited vs Radix IoT, LLC ( 90165341 (https://ttabvue.uspto.gov/ttabvue/v?pno=90165341&pty=EXT) ) 2021/08/11 - Radixx Solutions International, Inc. & Radix DLT Limited vs Radix IoT, LLC ( 90165249 (https://ttabvue.uspto.gov/ttabvue/v?pno=90165249&pty=EXT) ) 2021/08/11 - Radix IoT, LLC vs Radix DLT Limited ( 88288076 (https://ttabvue.uspto.gov/ttabvue/v?pno=88288076&pty=EXT) ) 2021/09/16 - Radixx Solutions International, Inc. 
vs Radix DLT Limited ( 90153774 (https://ttabvue.uspto.gov/ttabvue/v?pno=90153774&pty=EXT) ) ## RadUp URL: https://radix.wiki/ecosystem/radup Updated: 2026-02-06 Summary: RadUP is a validator node that is part of the Radix network. RadUP is a validator node that is part of the Radix network. RadUp.io (http://radup.io/) is a validator service committed to providing high-performance validator nodes for the Radix community. Overview RadUp have been involved as a validator on the Radix network since its inception and have been chosen as a validator on both the Olympia betanet and Babylon betanet. RadUp.io offers a high-quality, low-cost, and reliable validator service. They aim to serve the Radix network and community by running high-performing validator nodes. Mission The mission of the RadUp validator is to ensure the integrity and reliability of transactions on the Radix network, helping to maintain the overall stability and trustworthiness of the network. Fee RadUP charges a fixed fee of 1.95% (https://www.radup.io/our-fee) for staking services, ensuring that stakers receive more than 98% of the total rewards associated with their stake. RadUp aims to maintain a sustainable and fair fee structure. Features RadUp validator offers several features and benefits for stakers. Here are some of the key features: High Performance RadUp validator is designed to provide a high-performance node for staking XRD (Radix) tokens. They aim to ensure efficient and reliable operation to maximize rewards for stakers. Fair Fee Structure RadUp charges a fixed fee of 1.95% for their staking services. This fee structure is intended to be fair and transparent, ensuring that stakers receive more than 98% of the total rewards associated with their stake. 
Sustainable Rewards RadUp promises no fee increases for the duration of Olympia, the current phase of Radix network development. This commitment ensures that stakers can earn predictable and consistent rewards without unexpected fee changes. Easy Staking Staking with RadUp is made easy and straightforward. Stakers only need XRD tokens, the Radix desktop wallet, and the address of the RadUp validator node to start staking their tokens. Visibility and Transparency RadUp provides visibility into the validator nodes and their percentages of total stake. This transparency allows stakers to make informed decisions about their stakes and choose the validator that aligns with their preferences and goals. Staking Process To stake (https://www.radup.io/staking-with-us) with the RadUp validator, you can follow these general steps: Obtain Radix (XRD) tokens You need to acquire Radix (XRD) tokens to participate in the staking process. You can purchase these tokens from cryptocurrency exchanges that support Radix. Set up a Radix wallet Choose a Radix wallet that supports staking and is compatible with Radix tokens. The Radix desktop wallet is recommended for staking. Transfer XRD tokens to your wallet Once you have a Radix wallet, transfer the XRD tokens you acquired to your wallet address. Connect your wallet to the Radix network Open your Radix desktop wallet and connect it to the Radix network. Follow the wallet's instructions to set up and connect. Delegate to the RadUp validator In your wallet settings, look for the staking or delegate section. Enter the validator address for RadUp.io: rv1q2twzsq2eqd9et4lnh745k0r7heqp2d2maj57sh5eswj2rdrjex250d832l. Confirm and complete the delegation process following the wallet's instructions. Wait for confirmation It may take some time for the delegation to be confirmed on the network. Once confirmed, your tokens will be staked with the RadUp validator.
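The checklist above can be condensed into a couple of sanity checks before confirming a delegation. This is an illustrative sketch only: `prepare_delegation` is a hypothetical helper, the `rv1` prefix simply mirrors the validator address shown above, and actual delegation is performed through the Radix desktop wallet, not code.

```python
def prepare_delegation(validator_address: str, amount_xrd: float) -> dict:
    """Sanity-check delegation inputs before confirming them in the wallet UI."""
    # Mirrors the RadUp address shown above, which starts with "rv1".
    if not validator_address.startswith("rv1"):
        raise ValueError("expected a validator address starting with 'rv1'")
    if amount_xrd <= 0:
        raise ValueError("stake amount must be positive")
    return {"validator": validator_address, "amount_xrd": amount_xrd}

# Example with a shortened placeholder address:
print(prepare_delegation("rv1q2twzsq", 100.0))
```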
Security and Trust RadUp validator emphasizes security and trust as essential pillars of its services. Here are some key points regarding their security measures and efforts to build trust. Validator Experience RadUp has been involved as a validator on the Radix network since its inception. They have been selected as a validator on both the Olympia betanet and Babylon betanet. Their active participation demonstrates their experience and technical expertise in running a validator node. Transparency RadUp provides visibility into their validator nodes and their percentages of total stake. This transparency allows stakers to see the distribution of stake across validators and make informed decisions. Performance and Reliability RadUp focuses on providing a high-performance validator node to serve the staking needs of the Radix community. Their aim is to ensure efficient and reliable operation to maximize rewards for stakers. Trustworthiness and Reputation RadUp is committed to being a trusted validator for stakers. They have established themselves as an active member of the Radix node-running and wider community. These efforts collectively contribute to building a secure and trustworthy environment for stakers who choose the RadUp validator. ## Radstakes URL: https://radix.wiki/ecosystem/radstakes Updated: 2026-02-06 Summary: Radstakes is a staking service founded by Faraz Abulhawa. The service offers a secure and reliable infrastructure for users to delegate their Radix native token $XRD. Radstakes is a staking service founded by Faraz Abulhawa (https://www.notion.so/Faraz-Abulhawa-9772e9a3d14340a3a65704c5de145372?pvs=21). The service offers a secure and reliable infrastructure for users to delegate their Radix native token $XRD and participate in the staking process. Overview Radstakes has been operating a validator node in the top 100 since the mainnet launch, with prior testing on Betanet and Stokenet.
Radstakes emphasizes the importance of security, decentralization, and performance in their validator operations. They strive to contribute to the security and decentralization of the Radix network by encouraging users to delegate their $XRD tokens to their validator node. Staking with Radstakes offers benefits such as potential staking rewards and participation in securing the Radix network. By staking with Radstakes, users can actively support the operation and growth of the Radix ecosystem while earning rewards for their contributions. Source: Radstakes Blog - Radstakes Partners with Radix Planet (https://www.radstakes.com/blog) Product The products offered by Radstakes include stake delegation services, secure infrastructure with high uptime and dedicated servers, high availability and DDoS protection, as well as backup validator nodes to ensure maximum uptime and staking rewards. Mission The mission of Radstakes is to provide a trustworthy and secure staking service to the Radix network, allowing users to participate in the network's consensus protocol and earn staking rewards. Radstakes aims to contribute to the security and decentralization of the Radix network by operating as a reliable validator and promoting community engagement. Benefits Using Radstakes offers several benefits. By delegating tokens to a reliable validator, users can participate in the staking process without the need for technical expertise or setting up their own infrastructure. Stakers also have the opportunity to earn staking rewards, which can be a passive income stream from holding and delegating their $XRD tokens. Additionally, by participating in staking, users contribute to the security and decentralization of the Radix network. Staking Process The staking process with Radstakes typically involves the following steps: Connect your Wallet Begin by connecting your wallet that holds your Radix native token ($XRD).
Radstakes supports various wallets, and you will need to connect your preferred wallet to the Radstakes platform. Delegate your $XRD Tokens Delegation is done through the Radstakes platform. Monitor Staking Rewards Once you delegate your $XRD tokens to Radstakes, you become eligible to receive staking rewards. These rewards are typically distributed based on factors such as the number of tokens you have delegated and the duration of the delegation. You can monitor your staking rewards through the Radstakes platform or your connected wallet. Manage Delegation Radstakes provides tools to manage your delegation. You may have the flexibility to adjust your delegation amount, claim staking rewards, or withdraw your delegated tokens if needed. These management functions may vary based on the specific features offered by Radstakes. Security and Trust In terms of security, Radstakes aims to provide a secure infrastructure for staking operations. They utilize high-uptime, secure dedicated servers with DDoS protection and backup validator nodes to ensure maximum uptime and security. However, it's important for users to exercise caution and conduct their own due diligence when choosing a validator and participating in staking activities. ## Radnode URL: https://radix.wiki/ecosystem/radnode Updated: 2026-02-06 Summary: Radnode is a first-class validator and staking platform for Radix DLT (Distributed Ledger Technology), providing professional-grade staking services for cryptocurrency enthusiasts, institutional investors, and developers. Radnode is a first-class validator and staking platform for Radix DLT (Distributed Ledger Technology), providing professional-grade staking services for cryptocurrency enthusiasts, institutional investors, and developers. The platform leverages over a decade of professional full-stack software development and system administration experience to provide a secure and reliable platform for Radix DLT validation.
History Radnode was selected as a node runner under the Radix Betanet Programme, an initiative aimed at testing and improving the performance of the Radix DLT. The selection was based on Radnode's technical competence, experience, and its dedication to serving the Radix DLT community. Services Radnode provides a suite of services to enhance the security, reliability, and profitability of Radix DLT staking. Node Operation Radnode operates hardened nodes, which consist of high-quality, bare-metal hardware that operates in multiple availability zones. These nodes are meticulously maintained with strict server maintenance, redundancy, and fallback protocols in place. They are monitored 24/7, with instant inconsistency and issue detection alerts ensuring maximum node availability. DDoS mitigation and multi-region server tenancy further enhance the security and reliability of the service. Communication Radnode maintains multiple channels of communication with regular updates to keep its users informed. As an active member of the Radix DLT community, the platform is committed to providing users with timely and relevant information. This includes updates about server maintenance, staking performance, industry news, and more. Stake Delegation Users can delegate their stake to Radnode at a low fee of 0.75%. Radnode also partners with other qualified and KYC'd node runners, encouraging a distributed staking environment that enhances the resilience and security of the Radix DLT. Reinvestment Radnode reinvests its earnings into the continuous optimization of its node service, the development of tools for its delegators, and contributions to the Radix community. KYC Verification Radnode is fully Know Your Customer (KYC) verified by the Radix Foundation. This is to ensure the security and trustworthiness of the platform. Roadmap Radnode has an outlined roadmap for the continuous development and improvement of its services. 
This includes the development of a delegator platform, upgrades to the Crankshaft GaaS, and the release of a Babylon node platform. Contact Radnode can be reached through various social media channels, including Twitter (@radnode), Instagram (@radnode), and Github (github.com/radnode). For direct contact, Radnode can be reached via Telegram (@radnode) or email ( contact@radnode.io (mailto:contact@radnode.io) ). ## Radlock URL: https://radix.wiki/ecosystem/radlock Updated: 2026-02-06 Summary: RadLock is a project built on the Radix ecosystem, providing services aimed at increasing safety and trust in new projects. The platform offers token locking, team vesting, and a launchpad for new projects. RadLock is a project built on the Radix ecosystem, providing services aimed at increasing safety and trust in new projects. The platform offers token locking, team vesting, and a launchpad for new projects. RadLock's mission is to provide investors and project founders with the tools they need to make sound investments. Towards the Storm: Interview with BEEM from Radlock (https://youtu.be/zA_i38A4agc?si=yw33pGLRe13imfOC) Services Token Locking RadLock allows project builders to lock tokens, showing investors the percentage and length of the lock. This service also extends to liquidity provider tokens resulting from providing liquidity on a Decentralized Exchange (DEX). All locks are made public for investors to view, and RadLock provides a "trust score" for tokens based on various parameters. Token Vesting Project token owners can create vesting schedules through RadLock, where the release (or unlock) of tokens is broken down into smaller amounts, reducing price volatility. Vesting schedules support teams and provide a multi-signature mechanism to add, remove, and update shares for tokens. All vesting schedules are made public for investors to view. ICO Launchpad RadLock allows users to mint tokens through their platform.
Tokens can be locked or vested in one action. The launchpad also supports a presale service where project token owners can set a presale price, duration, and target amount to raise. RadLock Token Holders of the RadLock token ($RLOCK) have premium access and discounts to RadLock services as well as a fee share. Prior to the Radix Babylon release, RadLock gave people the opportunity to get $RLOCK tokens through public presales and airdrops. The total supply of $RLOCK is 100,000,000. Team The RadLock team consists of experienced software developers, each with 5+ years in software development and varying experience as contractors, employees, and entrepreneurs. The team includes:
- Snjak: Over 7 years of experience working at an international level in the travel and energy sectors.
- **Talesofbeem (https://www.notion.so/Beem-7110da825153456faf0dc1d15055fd6d?pvs=21):** A software developer with over 6 years of experience in the financial industry, particularly in payments and trading platforms. He is passionate about programming languages and Scrypto.
- **Slysmik (https://www.notion.so/Slysmik-9b7d579ff69d49d28f27724f55b8d8c1?pvs=21):** A creative director with over 20 years of experience working with high-level international clients in both private and public sectors.
## Radland URL: https://radix.wiki/ecosystem/radland Updated: 2026-02-06 Summary: Ancient Chiggimps (VikingLand) Trond (VikingLand) BCD (Scrypto) (front-end) Status: 💀 RadLand is an NFT marketplace built on the Radix (https://www.radixdlt.com/) network. Launched in 2024, it aims to be a premier destination for minting, trading, and discovering NFTs on Radix.
https://youtu.be/NOQ56qZbQsg?si=qPjyDGFAF0GLous_ History RadLand was founded in 2024 through a collaboration between the teams behind two early Radix NFT marketplaces - Radish Square (https://radishsquare.com/) and VikingLand (VikingLand%20083e2a138eaa407ea631821a19705604.md). The goal was to combine the UI/UX expertise of Radish Square with the backend infrastructure of VikingLand to create a best-in-class NFT platform. The founding team consists of experienced Web3 builders including Mik (Radish Square founder), Ancient, Chiggimps, Trond (VikingLand founders), BCD (Scrypto specialist), and Kamil (https://www.notion.so/60fb6af894c543f1bb26a34a0b2a5d31?pvs=21) (front-end developer). Their collective experience in building successful marketplaces underscored the potential for a unified NFT hub on Radix. Features Key features of RadLand include:
- Bulk minting - Streamlined system for minting multiple NFTs at once
- Intuitive UI - User-friendly interface for buying, selling, and managing NFTs
- Olympia migration - Tools for claiming NFTs minted prior to the Babylon upgrade
- Royalty payments - Built-in royalty schemes to compensate creators
- Random minting - Ability to mint random NFTs from a collection
- Auctions - Native auction system for price discovery
- Sniping tools - Features for traders to acquire sought-after NFTs
- APIs - Robust API suite for integrating with other platforms
Technology RadLand utilizes:
- Babylon - Latest iteration of the Radix public ledger
- Scrypto - Secure smart contract language built for Radix
- Cerberus - The Radix consensus protocol that secures the ledger
By building natively on Radix technology, RadLand is positioned to be a scalable, secure, and decentralized NFT marketplace.
Roadmap Upcoming milestones for RadLand include:
- Streamlining the migration of Olympia NFTs
- Prioritizing community feedback for platform improvements
- Balancing creator and user needs through thoughtful feature development
- Expanding functionality around trading, analytics, integrations, and more
## RadKET URL: https://radix.wiki/ecosystem/radket Updated: 2026-02-06 Summary: Radket is an online marketplace that accepts only cryptocurrencies as a payment method. It is the first of its kind and aims to provide a platform for buyers and sellers to trade using cryptocurrencies. Radket is an online marketplace that accepts only cryptocurrencies as a payment method. It is the first of its kind and aims to provide a platform for buyers and sellers to trade using cryptocurrencies. The project was initiated in 2021 and is currently in development. Overview The RadKET marketplace is an online platform that allows users to browse and purchase products using cryptocurrencies. It aims to build a fully functional and effective marketplace available in multiple countries, with $RDK (RadKET's native token) and several stable and non-stable cryptocurrencies accepted as payment methods. The marketplace is expected to be released in the first quarter of 2024, starting with a beta testing phase and the recruitment of initial sellers. RadKET is committed to partnering with well-known brands and sellers to offer a wide range of products on their platform. Their goal is to create a user-friendly cryptocurrency marketplace, similar to traditional marketplaces like Amazon, where users can buy various supplies and items using their favorite cryptocurrencies. The RadKET team has also developed a tokenomics model for their platform. The $RDK token will be used extensively on RadKET, and users will need $RDK to access various benefits on the platform, including making purchases.
Mission The mission of RadKET Marketplace is to create a global online marketplace that bridges the gap between traditional e-commerce and the cryptocurrency world. They aim to provide a platform where users can easily buy and sell products using various cryptocurrencies, making it more accessible for cryptocurrency holders to utilize their digital assets for everyday purchases. RadKET is focused on creating a user-friendly and familiar shopping experience, similar to traditional online marketplaces like Amazon, while incorporating the benefits and convenience of using cryptocurrencies as a payment method. By accepting multiple stable and non-stable cryptocurrencies, RadKET aims to offer a wide range of payment options to cater to the preferences of different users. Additionally, RadKET aims to partner with well-known brands and sellers to offer a diverse selection of products on their platform, ensuring a high-quality shopping experience for its users. Overall, RadKET's mission is to encourage the widespread adoption of cryptocurrencies by providing a secure, user-friendly, and globally accessible marketplace for buying and selling goods. 
Roadmap Here is a roadmap (https://www.radket.com/) from RadKET:
- Q4 2021 - RadKET's birth - RadKET general conceptualization, thinking through how the final marketplace and airdrop will work, as well as the token distribution
- Q1/Q2 2022 - Creation of our token $RDK - Creation of the documentation - Beginning of the general website and airdrop website development
- Q3/Q4 2022 - Social media creation (Twitter, Telegram), beginning of the online marketing - Tokenomics creation - Beginning of the first $RDK giveaways - Release of the website, the documentation, and beginning of the airdrop
- 2023 - Will be updated during 2023
- Q1 2024 - Beta test, recruitment of the first sellers, and then release of the final marketplace, fully functional and effective in several countries with $RDK and several stable and non-stable coins as payment methods - As RadKET will be our main product, after the release we will continue to work on it to improve it and make it the best crypto-only marketplace in the world
Tokenomics Here is the tokenomics (https://docs.radket.com/radket/overview/tokenomics) from RadKET. Airdrop minigame: 26m $RDK These tokens will all be distributed through the airdrop website. Everything should be gone by the time we release the final marketplace -- in case there are remaining tokens when we want to release the marketplace (which is unlikely to happen), we will distribute them more quickly or give them away fairly to the airdrop game users. When the airdrop is over, 72% of the tokens will be circulating on the market. Giveaways: 10m $RDK Tokens that are being distributed mainly through Twitter (http://twitter.com/RadketRDK) giveaways, airdrops on the active community wallets, etc. Team: 10m $RDK Personal tokens of the two RadKET founders. Marketing & development: 4m $RDK These tokens can be sold for marketing or development. Example: advertising campaigns, server maintenance.
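The allocation figures above can be cross-checked against the stated post-airdrop circulating share. Assuming the four buckets listed make up the entire supply, and reading "circulating" as the airdrop plus giveaway tokens, the arithmetic matches the 72% figure exactly:

```python
# $RDK allocations as listed in the tokenomics above.
allocations = {
    "airdrop_minigame": 26_000_000,  # distributed via the airdrop website
    "giveaways": 10_000_000,         # Twitter giveaways, community airdrops
    "team": 10_000_000,              # the two RadKET founders
    "marketing_dev": 4_000_000,      # advertising, server maintenance
}
total_supply = sum(allocations.values())  # 50,000,000 $RDK

# One reading consistent with the stated figure: circulating supply after
# the airdrop = airdrop minigame tokens + giveaway tokens.
circulating = allocations["airdrop_minigame"] + allocations["giveaways"]
print(f"{circulating / total_supply:.0%}")  # 72%
```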
Supported networks RadKET Marketplace supports multiple blockchain networks, allowing users to transact with various cryptocurrencies. While specific details may vary, here are some of the supported networks: Ethereum ($ETH) RadKET is built on the Ethereum blockchain and supports ERC-20 tokens. Users can transact with Ethereum and ERC-20 tokens on the marketplace. Binance Smart Chain ($BSC) RadKET also supports the Binance Smart Chain, allowing users to transact with cryptocurrencies that are built on BSC. Polygon ($MATIC) RadKET has integrated with the Polygon network to provide users with the option to transact with cryptocurrencies that are based on Polygon. Avalanche ($AVAX) RadKET has integrated with the Avalanche network, enabling users to utilize cryptocurrencies that are built on the Avalanche platform. These are some of the blockchain networks that RadKET Marketplace supports. Staking Process The staking process on the RadKET Marketplace involves staking the native token, $RDK, to earn rewards. Here's a general overview of the staking process: Obtain $RDK To participate in staking, you need to acquire $RDK tokens (https://docs.radket.com/radket/overview/usdrdk) . You can obtain these tokens through various means, such as purchasing them on supported exchanges or participating in token sales. Connect Wallet Connect your digital wallet to the RadKET Marketplace platform. Supported wallets may vary, but typically popular wallets like MetaMask are compatible. Access Staking Once your wallet is connected, navigate to the staking section on the RadKET Marketplace platform. This may be accessible through a specific tab or section on the website or user interface. Choose Staking Option Select the staking option that suits your preferences. RadKET Marketplace may offer different staking periods or tiers with varying rewards and lock-up periods. Choose the option that aligns with your staking goals. 
Confirm Staking Confirm the staking transaction by approving the staked amount of $RDK from your wallet. This will typically involve signing the transaction with your wallet's private key. Staking Period Once your tokens are staked, they will be locked for a specified staking period. During this time, you won't be able to access or transfer the staked $RDK tokens. Earn Rewards While your $RDK tokens remain staked, you will earn rewards over time. The specific rewards and earning rates are determined by the staking offering on RadKET Marketplace. Unstaking After the staking period is completed, you can initiate the unstaking process. This usually involves submitting an unstaking request and waiting for the predetermined cooldown period before your tokens are released. Claim Rewards Once your tokens are unstaked, you can claim the accumulated rewards. This process may vary, but typically involves confirming the claim transaction through your wallet. Security and Trust RadKET Marketplace is committed to providing a secure and trustworthy platform for its users. Here are some of the security measures and trust factors that RadKET Marketplace has implemented to ensure the safety of its users: Secure Infrastructure RadKET uses a secure infrastructure to ensure that user information and transactions are protected. They utilize the latest security technologies such as firewalls, encryption, and multi-factor authentication to help secure the platform and user data. KYC and AML RadKET Marketplace has implemented KYC (Know Your Customer) and AML (Anti-Money Laundering) policies to verify the identity of its users and prevent fraudulent transactions and activities. Smart Contract Audit RadKET has had its smart contracts audited by reputable third-party security firms to ensure that they are secure and free of vulnerabilities.
Cold Storage To minimize the risk of loss or hacking of user funds, RadKET stores the majority of its users' funds in cold storage wallets that are physically offline, in addition to a hot wallet for liquidity purposes. Experienced Team The RadKET team comprises experts in blockchain technology, finance, and cybersecurity, which provides credibility and trust. Community Trust RadKET Marketplace has a growing community and active social media presence, with positive feedback and reviews from users. Open Communication RadKET Marketplace keeps its users informed about important updates or changes to the platform and is responsive to user feedback and concerns. Insurance RadKET has partnered with reputable insurance providers to protect its users' funds in case of a security breach or hacking incident. ## RadixUID URL: https://radix.wiki/ecosystem/radixuid Updated: 2026-02-06 Summary: RadixUID is a secure and user-centric identity management system built on the Radix blockchain. It aims to address the limitations of traditional identity systems. RadixUID is a secure and user-centric identity management system built on the Radix blockchain. It aims to address the limitations of traditional identity systems by leveraging the power of blockchain technology. Overview RadixUID is a project within the Radix ecosystem. RadixUID also refers to a validator node called "Sofia Validator," operated by the RadixUID team. The Sofia validator consists of two servers hosted in a data center in Sofia, Bulgaria. It is run by a professional team and applies a 2% fee for all staking rewards. The backing assets of the Sofia validator are reserves used for trading to maintain the token's price stability. Features Key features and objectives of RadixUID include: Self-Sovereign Identity RadixUID allows individuals to have full control over their personal data and identity information.
Users can securely manage and share their identity attributes without relying on centralized authorities. Interoperability RadixUID is designed to be interoperable, meaning the identities created using RadixUID can be used across different applications and platforms. This allows for seamless integration and sharing of identity information. Privacy and Security RadixUID prioritizes privacy and security, ensuring that users' identity information is stored and transmitted securely. The use of cryptographic algorithms and decentralized consensus protocols helps protect user data from unauthorized access. Application Integration RadixUID provides developers with the tools and infrastructure to integrate decentralized identity into their applications. This enables various use cases, such as secure login, authentication, and verifiable credentials. By offering a decentralized and self-sovereign identity solution, RadixUID aims to empower individuals, enhance privacy, and facilitate seamless and secure interactions in the digital world. Benefits RadixUID offers several benefits to users: Full Control Over Personal Data With RadixUID, individuals have full control over their personal data and identity information. They can choose which attributes to share, who to share them with, and for how long. No central authority or intermediaries are involved, providing a high level of privacy and security. Seamless Integration RadixUID is designed to be interoperable, allowing users to use their decentralized identity across multiple applications and platforms seamlessly. This eliminates the need to create different accounts for each service, reducing the risk of data breaches and identity theft. Improved Security RadixUID leverages the security features of the Radix blockchain, including strong cryptography and decentralized consensus protocols, to ensure that user data is tamper-proof and protected from unauthorized access.
Greater Efficiency With RadixUID, individuals don't need to go through lengthy and time-consuming identity verification processes each time they sign up for a new service. Identity-related transactions are streamlined, increasing efficiency and reducing costs for both users and service providers. Access to New Services By using RadixUID, individuals can access new services that require verified identities, such as secure login and authentication, decentralized finance (DeFi), and digital voting systems. Overall, RadixUID offers a user-centric, secure, and self-sovereign identity solution that empowers individuals and facilitates trusted interactions in the digital world. ## RadixTalk URL: https://radix.wiki/ecosystem/radixtalk Updated: 2026-02-06 Summary: RadixTalk is an online forum community focused on discussing Radix. The forum was created by @0xMattia in 2021. RadixTalk is an online forum community focused on discussing Radix. The forum was created by @0xMattia (https://www.notion.so/Mattia-4edffe051d294347beb7feb147662129?pvs=21) in 2021. Overview Topics on RadixTalk span from technical discussions about building on Radix to conversations about staking, governance, and the Radix ecosystem. The forum has sections for general chat (https://radixtalk.com/c/chat-discussion/42/none) , questions (https://radixtalk.com/c/questions/35/none) , projects (https://radixtalk.com/c/projects/40/none) , developers (https://radixtalk.com/c/developers/24/none) , validators (https://radixtalk.com/c/validators/16/none) , staking (https://radixtalk.com/c/staking/27/none) , governance (https://radixtalk.com/c/governance/46) , announcements (https://radixtalk.com/c/announcements/31/none) , news (https://radixtalk.com/c/news/45) , and site feedback (https://radixtalk.com/c/site-feedback/2/none) . The forum is run by community volunteers along with support from RDX Works (RDX%20Works%208092803705114cd1b34bbef5dd98d7af.md) , the core development company behind Radix. 
As an open, decentralized community, RadixTalk aims to be a civil, friendly platform for public discussion related to Radix and its growing ecosystem. History RadixTalk was launched in 2021 shortly before the release of Radix's Alexandria (https://www.notion.so/Radix-Developer-Environment-Alexandria-ab78b3f478594b4d8905360c2fade517?pvs=21) testnet. The forum was created to provide a dedicated online space for the Radix community to come together and discuss the new protocol. As Radix progressed through its testnet releases of Alexandria (https://www.notion.so/Radix-Developer-Environment-Alexandria-ab78b3f478594b4d8905360c2fade517?pvs=21) , Olympia (https://www.notion.so/Radix-Mainnet-Olympia-8fa2123a247c4e60853253ca2ab3e40e?pvs=21) , and Babylon (https://www.notion.so/Radix-Mainnet-Babylon-2cdc168d56e04787a9f412c23560c4b0?pvs=21) , RadixTalk served as an important hub for developers, node runners, stakers, and other community members to collaborate. During the transition periods between testnet releases, the forum was instrumental in coordinating efforts among validators to migrate and upgrade. In the lead up to Radix's mainnet launch in 2023, RadixTalk saw expanded discussions around staking, governance, dApp development, and other topics critical to the nascent network. While still a young forum, RadixTalk has quickly become a vital meeting place for Radix's decentralized, global community. With Radix mainnet activated and additional growth on the horizon, the forum will likely continue to evolve as a key resource for community engagement, support, and coordination around the Radix network. Features RadixTalk includes several features to facilitate community discussion and engagement: - Topic Categories (https://radixtalk.com) - The forum has dedicated sections for conversations ranging from general chat to technical discussions on validators, developers, staking, governance and more. This allows users to easily find and participate in topics of interest. 
- Profiles - Users can create accounts and profiles to track activity and preferences across the forum. Profiles allow for reputation building and help identify expertise. - Upvotes/Downvotes - Users can upvote high-quality posts and downvote unhelpful posts. This community curation surfaces the best content. - Search - A search function allows users to find existing topics and information throughout the forum. This prevents duplicative posts. - Notifications - Users can opt into email and web notifications to stay updated on forum activity and conversations. - Chat - Real-time chat functionality facilitates quick conversations and community engagement. - Mobile Support - RadixTalk is mobile-friendly, enabling users to participate from phones and tablets. - Moderation - Community moderators and admin help maintain a constructive, friendly environment in line with forum rules. - Multilingual - The forum provides dedicated sections for major languages including Italian and German. ## RadixStake URL: https://radix.wiki/ecosystem/radixstake Updated: 2026-02-06 Summary: RadixStake is a platform that allows users to earn rewards in the form of Radix XRD tokens. RadixStake is a platform that allows users to earn rewards in the form of Radix XRD tokens. Overview RadixStake is a platform that allows users to stake their Radix (XRD) tokens and earn rewards. It utilizes server uptime monitoring to ensure optimal operations of validator nodes, providing constant network connectivity. RadixStake works with top-tier cloud infrastructure providers, such as DigitalOcean and Linode, to host primary and backup validator nodes. By staking their Radix tokens with validators like RadixStake, users can help secure the Radix ledger and protocol. The selection of validators is an important decision as it can impact the share of emission rewards from the Radix protocol. 
In December 2021, delegators who staked a certain amount of XRD with RadixStake were eligible for the final Radical airdrop. The airdrop distribution was calculated based on the total number of eligible delegators and their XRD amount staked with RadixStake. Sources: RadixStake - Earn Radix XRD Rewards (https://www.radixstake.com/) Radix Staking and Incentive Rewards Guide (https://www.radixdlt.com/blog/radix-staking-and-incentive-rewards-guide) RadixStake Questions (https://www.radixstake.com/questions.html) Benefits If you join RadixStake, you can enjoy several benefits: Radix Mainnet Validator RadixStake has qualified experience as a Radix Mainnet Validator, Betanet Validator, Stokenet Validator, and Cassandra Test Network node runner. Reliable Node Operations RadixStake provides 24/7 monitoring and alerting to ensure the optimal performance and uptime of their validator nodes. Earn Radix XRD Rewards By participating in staking with RadixStake, you can earn Radix (XRD) emissions rewards. These rewards are automatically created in a staked state and are emitted pre-staked to the same validator where the originally staked XRD tokens were placed. You don't need to manually re-stake emitted tokens to compound your rewards, as the emitted tokens immediately begin earning emissions. Competitive Fee RadixStake offers a fair and competitive fee of 1.49%. This fee is currently below the average of the top 15 competing validators and is designed to incentivize strong validator performance. Products RadixStake primarily offers a staking service for Radix (XRD) tokens. Here are the main products associated with RadixStake: Radix Staking RadixStake allows users to stake their Radix (XRD) tokens and earn rewards. By staking their tokens, users contribute to the security and operations of the Radix network while earning emissions rewards.
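The auto-compounding of pre-staked emissions described above can be sketched in a few lines. This is an illustrative model only: the network emission rate is an assumed placeholder, and only the 1.49% validator fee figure comes from the text.

```python
# Illustrative sketch of auto-compounding staking emissions.
# ASSUMPTION: network_apy is a placeholder rate, not a quoted Radix figure;
# the 1.49% validator fee is the only number taken from the page above.

def project_stake(initial_xrd: float, network_apy: float,
                  validator_fee: float, years: int) -> float:
    """Compound emissions yearly; the validator fee is deducted from each emission."""
    stake = initial_xrd
    for _ in range(years):
        emission = stake * network_apy
        # Emitted tokens arrive pre-staked, so they earn emissions immediately.
        stake += emission * (1 - validator_fee)
    return stake

# Example: 10,000 XRD at an assumed 7% emission rate with a 1.49% validator fee.
final = project_stake(10_000.0, 0.07, 0.0149, 3)
```

Because emissions land already staked, the delegator never has to submit a re-stake transaction to get the compounding effect modeled by the loop above.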
Validator Node Hosting RadixStake works with top-tier cloud infrastructure providers, such as DigitalOcean and Linode, to host primary and backup validator nodes. This ensures constant network connectivity and optimal operations of the validator nodes. Staking Process The staking process with RadixStake typically involves the following steps: Obtain Radix ($XRD) Tokens First, you will need to obtain Radix ($XRD) tokens. You can acquire $XRD tokens through exchanges or other means, ensuring that you have the required amount for staking. Choose a Validator Once you have $XRD tokens, you need to select a validator to delegate your tokens to. RadixStake is a reputable validator provider that you can consider for this purpose. Delegate Your Tokens Transfer your $XRD tokens to the chosen validator's staking address. This process typically involves creating a transaction from your wallet to the specified staking address provided by the validator. Wait for Confirmation After delegating your tokens, you will need to wait for the transaction to be confirmed on the Radix network. This confirmation process involves the network validating and adding your transaction to the ledger. Start Earning Rewards Once your transaction is confirmed, your tokens will start participating in the staking process. As a validator, RadixStake will handle the technical aspects of running the node and securing the network. You will start earning emissions rewards in the form of additional $XRD tokens based on your staked amount. Manage Your Staked Tokens You can choose to compound your earnings by staking the emitted tokens or withdraw them, depending on the staking protocols and rules set by RadixStake. ## RadixScan URL: https://radix.wiki/ecosystem/radixscan Updated: 2026-02-06 Summary: RadixScan is a web service that provides detailed and comprehensive information about the Radix network.
The service offers an array of features that includes an archive node, partnerships, and data about the Radix network. RadixScan is a web service that provides detailed and comprehensive information about the Radix (https://www.radixdlt.com/) network. The service offers an array of features that includes an archive node, partnerships, and data about the Radix network. History RadixScan.com (http://RadixScan.com) was gifted to the team by Wylie from Radnode.io (Radnode%20f20371f5d2404d49a7483625c4e7d210.md) and XRD Domains (XRD%20Domains%20c66e0aea7cf4481baacdfcc4a9a490b7.md) . Services The RadixScan platform provides various services related to the Radix network. Some of the available tools on the platform include an Archive Node, Transaction Exporter, and Stake Exporter. EmmogluStakery Archive Node RadixScan operates an archive node (https://www.radixscan.io/EmmogluStakeryArchiveNode.shtml) within the Radix mainnet. The archive node is used for running queries for RadixScan but is also open for use by others. However, it has not yet been switched to the gateway system, making it unsuitable for wallet use. The site provides users with detailed instructions on how to set up an alternative node in their wallets. Stake Exporter The RadixScan team has also developed a 'Stake Exporter'. This service provides a detailed analysis of a user's staking address for a small fee of 1 $XRD (Radix's native token). The results of the analysis are uploaded to IPFS and a link to the results is sent to the user in an encrypted message. Partnerships RadixScan has established partnerships with Clana, Radix Talk, Xidar (Xidar%2093bbfb64b3384865ba3c8cd989d6f8f1.md) , Radix List (Radix%20List%20aef60c7dca40425b8bc25982bb882dcc.md) , RADIX.wiki (https://www.notion.so/fb6d07f4e5ef45e09d83cc099e75faa8?pvs=21) , Radix Name Service (XRD%20Domains%20c66e0aea7cf4481baacdfcc4a9a490b7.md) , Hermes Protocol (Hermes%20Protocol%206febd13a64a94d1f8304b0efe490fb21.md) , and Z3US (Z3US%2076d2ff8538de46c3b8734f552b4cf203.md) among others.
They are also collaborating with RadixCharts (RadixCharts%207d3e4d3e34404029940fc25821dc79d4.md) to provide aggregated data for the Radix network. Through these collaborations, RadixScan aims to be the primary data source for all Radix users. Founders RadixScan was started by Michael, from Vorarlberg in Austria, who has been interested in crypto since 2015. He was fascinated by the possibilities of decentralization, democratization of finances, and particularly, the potential for anonymization. He started the project RadixScan because he believed in the potential of Radix to grow significantly. He also mentioned appreciating the Radix community for its humor and the helpful exchanges he's had with members (https://radixscan.io/AboutUs.shtml) . Leo, from Rio de Janeiro, Brazil, joined the project after meeting Michael in the Radix community. He had been interested in the concept of decentralization since he was 17 and was drawn to crypto in 2017. When he encountered Radix in June 2019, he was immediately intrigued by the project's intellectual rigor and the technological advancements it presented compared to other Distributed Ledger Technologies (DLTs). He and Michael participated in Cassandra tests and decided to run the Emmoglu Stakery together after the Betanet phase (https://radixscan.io/AboutUs.shtml) . Contact - Michael: t.me/chatMichael, Michael@RadixScan.io (mailto:Michael@RadixScan.io) - Leo, Emmoglu Stakery: t.me/EmmogluStakery (http://t.me/EmmogluStakery) ## RadixRadio URL: https://radix.wiki/ecosystem/radixradio Updated: 2026-02-06 Summary: RadixRadio is a Radix-focused podcast hosted by Jimmy Humania, having been founded by long-time Radix community member @kilovoltage. The podcast launched in late 2022. RadixRadio is a Radix-focused podcast hosted by Jimmy Humania (https://www.notion.so/Jimmy-Humania-0b1bf8637d154959a8c925d533e23ddf?pvs=21) , having been founded by long-time Radix community member @kilovoltage (https://twitter.com/realKilovoltage) .
The podcast launched in late 2022 as a way to highlight new projects being built on Radix and interview influential figures within the Radix community. RadixRadio Episode 1: Radish Eco “Mik” @BonaFidePlug (https://youtu.be/Gjwe-VWsHSk?si=ysNVcVbPUUFdUUVq) RadixRadio Episode 1: Radish Eco “Mik” @BonaFidePlug (https://twitter.com/BonaFidePlug) Overview New episodes of RadixRadio are released on a weekly basis, with most following an interview format. Kilovoltage will typically speak in-depth with founders of Radix-based projects or other important voices in the community. There is also audience participation during the live recording of episodes, with listeners able to join the discussion and ask questions. The podcast aims to provide a unique opportunity for community engagement within the Radix ecosystem. By interviewing project leaders and discussing the latest developments, it helps keep the community informed and connected. RadixRadio also serves as an educational resource, with discussions exploring the technical workings of Radix and its advantages over other blockchains. Since its launch, RadixRadio has developed a dedicated following within the Radix community. Episodes regularly garner hundreds of listens across platforms like Twitter, YouTube, and podcast services. As the Radix ecosystem continues to grow leading up to the Babylon smart contract release in 2023, RadixRadio looks to establish itself as a go-to source for staying up-to-date. Format RadixRadio recordings take place via the Twitter Spaces feature. This allows kilovoltage to host live voice conversations focused around a particular topic or guest. Listeners can tune into the Twitter Space live and participate in the discussion. After the initial recording, the audio from the Twitter Space is extracted and uploaded onto other platforms like YouTube, Spotify, Apple Podcasts etc. This allows the episodes to remain available after the 30-day expiration period for Twitter Spaces. 
The majority of RadixRadio episodes follow an interview format, with kilovoltage speaking one-on-one with a guest. These guests are typically founders of projects built on Radix or influential voices within the Radix community. In addition to the interview content, listener participation and questions are a key component of the podcast. Those tuning into the Twitter Space recordings are able to raise their hand and be brought on air by kilovoltage to ask questions or share perspectives. This interactivity is a core part of what makes RadixRadio unique. The free-flowing discussion format enabled by Twitter Spaces gives RadixRadio a conversational tone. Kilovoltage aims to make guests feel comfortable opening up, while still extracting key insights about their projects and Radix involvement. This casual yet informative style has resonated with listeners. Episodes RadixRadio has featured interviews with a diverse range of guests from across the Radix ecosystem since launching in late 2022. Some notable episodes include: - Episode 1: Interview with Radish Eco founder "Mik" about his NFT project, plans for future roadmap and utilities, rights for NFT holders, and journey into the Radix community. - Episode 2: Interview with Robo Creator about his Radical Robos generative NFT collection. Discussed original 3D art process, community-focused approach, plans to incorporate robo characters into future metaverse and gaming concepts. - Episode 3: Interview with founders of Radical Penguins NFT project Ice Penguin and Pengu Creator. Talked about being first 3D NFT on Radix, plans for NFT marketplace, charity donations, roadmap. - Episode 4: Interview with Vlad B about his projects Nerds Republic and Foton NFT marketplace. Discussed origins of nerd avatars, plans to integrate related metaverse concepts, early stages of Radix involvement. - Episode 5: Interview with founders of Bobcat Society NFT project Baron and Lynx. 
Discussed decision to switch from Solana blockchain to Radix, community differences, project roadmap and vision. - Episode 6: Interview with founder of Roidboiz NFT project Radix Enigma. Discussed generative art process, roadmap, traits, plans to collaborate with other Radix projects. - Episode 7: Roundtable discussion with Radix community leaders B Like Water, Bobby Sizemore, Jordan T Jones, and Squidward XRD. Talked about their history in Radix, approach to community building, and outlook on future growth. The wide range of projects and personalities covered demonstrates the expanding diversity of the Radix ecosystem. Hosts Jimmy Humania Jimmy Humania (https://www.notion.so/Jimmy-Humania-0b1bf8637d154959a8c925d533e23ddf?pvs=21) Kilovoltage RadixRadio was founded and hosted by @kilovoltage (https://twitter.com/realKilovoltage) . He has been an active member of the Radix community for over a year, joining in late 2021. Kilovoltage is based in the southwestern United States. His interest in Radix began after learning about it on Reddit and joining the early Telegram community. He was attracted by the technical ambitions of the project and the helpfulness of early supporters. In addition to hosting RadixRadio, kilovoltage serves as a volunteer ambassador for Radix. He aims to help grow the ecosystem by promoting projects and organizing real-world events. His passion for the community and the networked nature of Radix are what motivated him to start the podcast. Reception Since launching in late 2022, RadixRadio has been well-received within the Radix community. It has become a popular medium for projects to reach the community and build connections between different ecosystem participants. Episodes of the podcast consistently garner hundreds of listens across platforms like Twitter, YouTube, and podcast services. The interactive live format provides a unique opportunity for community engagement not found on most blockchain podcasts.
As both the Radix technology and ecosystem continue maturing, RadixRadio looks poised to establish itself as a leading source for discussion and updates. With the core community rapidly expanding heading into 2023, the podcast can help onboard new members and promote cohesion. ## RadixPlanet URL: https://radix.wiki/ecosystem/radixplanet Updated: 2026-02-06 Summary: RadixPlanet is a decentralized cryptocurrency exchange (DEX) built on the Radix network. It was developed by the RadixPlanet team to provide a fast, secure, and easy way to trade tokens on the Radix ledger. RadixPlanet is a decentralized cryptocurrency exchange (DEX) built on the Radix network. It was developed by the RadixPlanet team to provide a fast, secure, and easy way to trade tokens on the Radix ledger. https://youtu.be/fYNlQ0AHbAA?si=7O8kn0TpqzNt-17w (https://youtu.be/fYNlQ0AHbAA?si=7O8kn0TpqzNt-17w) Introduction The RadixPlanet DEX launched in 2023 and utilizes the unique capabilities of the Radix protocol to offer features not found in other DEX platforms. Key highlights of RadixPlanet include: - Support for creating liquidity pools with flexible rules and incentives - Options for public and private liquidity pools - Liquidity mining rewards distributed based on trading volume to incentivize providing liquidity where it is most useful - Configurable swap fees and royalty fees to support development - An integrated treasury to fund future development through collected fees - A two-way Bitcoin bridge to allow wrapping Bitcoin for use on Radix The RadixPlanet DEX aims to leverage the scalability, security, and composability of the Radix network to create a next-generation DEX for trading tokens and providing liquidity. The project is led by the RadixPlanet team and supported by the Radix community. Features Full-Range Pools RadixPlanet utilizes full-range liquidity pools at launch. Full-range pools allow the price of tokens within the pool to fluctuate freely based on supply and demand, from zero to infinity.
This helps ensure that liquidity does not completely dry up on one side of the trading pair. Full-range pools adhere to the standard formula: x * y = k, where x and y represent the available liquidity of the two tokens and k is a constant. This formula maintains the invariant relationship between the two sides of the pool as trades occur. Compared to concentrated liquidity models, full-range pools carry the risk of larger impermanent loss for liquidity providers due to higher price volatility. But they allow any price range to be supported. Multi-Token Pools RadixPlanet also supports multi-token liquidity pools at launch. Multi-token pools contain three or more tokens rather than the typical two tokens. When executing a trade, two tokens within the pool are selected. The smart contracts will handle the pricing and exchange between those two tokens based on their available liquidity in the pool, while the other tokens are unaffected. Multi-token pools provide flexibility in the trading pairs that can be supported. They also reduce the need for multiple two-token pools between a group of tokens. However, they can introduce additional complexity in valuing the underlying assets. Private/Public Liquidity RadixPlanet allows liquidity pool creators to configure pools as either public or private. This provides flexibility in how pools can be structured. Public pools allow anyone to provide liquidity. They offer full transparency to liquidity providers, as the parameters of the pool cannot be changed after creation. Public pools help attract liquidity by being open and non-exclusive. Private pools require approval to provide liquidity. Only addresses that have been granted a pool membership by the manager can add funds. Private pools give the creator more control over the liquidity sources and management. They can be useful when trying to limit access or create exclusive trading environments. 
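The constant-product rule x * y = k described above can be sketched in a few lines. The pool class and method names here are illustrative, not RadixPlanet's actual interface; the 0.2% default mirrors the swap fee mentioned later for RadixPlanet pools.

```python
# Minimal sketch of a full-range constant-product (x * y = k) pool.
# ASSUMPTION: class and method names are illustrative, not RadixPlanet's API.

class FullRangePool:
    def __init__(self, reserve_x: float, reserve_y: float, fee: float = 0.002):
        self.x = reserve_x   # available liquidity of token X
        self.y = reserve_y   # available liquidity of token Y
        self.fee = fee       # e.g. the 0.2% swap fee

    def quote_y_out(self, x_in: float) -> float:
        """Amount of Y released for a deposit of X, preserving x * y = k after fees."""
        x_in_after_fee = x_in * (1 - self.fee)
        k = self.x * self.y
        new_x = self.x + x_in_after_fee
        return self.y - k / new_x

    def swap_x_for_y(self, x_in: float) -> float:
        y_out = self.quote_y_out(x_in)
        self.x += x_in
        self.y -= y_out
        return y_out

pool = FullRangePool(1_000_000.0, 1_000_000.0)
got = pool.swap_x_for_y(10_000.0)  # price moves against the trader as the pool rebalances
```

Because `y - k / new_x` approaches but never reaches the full Y reserve, one side of the pool can never be fully drained, which is the "liquidity does not completely dry up" property noted above.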
Adding Liquidity Adding liquidity to pools on RadixPlanet is handled through Non-Fungible Tokens (NFTs) representing liquidity positions rather than typical fungible LP tokens. When first supplying assets to a pool, a new NFT “liquidity badge” is minted to uniquely represent the position. Users can later add to an existing position to increase the amount of liquidity represented by that NFT. This helps simplify tracking returns from a given liquidity amount over time. NFT liquidity positions integrate with other signature RadixPlanet features like withdrawable earnings. Owners can manage or move positions while retaining associated earnings claims from collected fees. Customizable Fee Rules RadixPlanet allows custom fee rules to be set on new liquidity pools. Creators can choose different fee types, collection methods, and beneficiary addresses to incentivize providing liquidity tailored to their use case. Available fee rule configurations include: - Fee Type - Percentage of trade amount - Fixed token amount - Collection Method - Add to liquidity - Route to fee address - Fee Token - Specify which token fee is collected in - Fee Amount - Percentage or fixed token quantity Multiple fee rules can be stacked to split fees between liquidity providers, pool creators, treasury deposits, etc. Flexible fee rules allow unique liquidity pool configurations and incentives. Automatic Re-Investing Fee rules on RadixPlanet can enable automatic re-investing of collected fees back into liquidity pools. By using an "Add to Liquidity" collection method, swap fees get routed directly back into the pools they came from rather than external addresses. This incrementally increases liquidity over time through usage without manual claiming needed. Automatic re-investing helps reduce impermanent loss, boosts liquidity depth, and can increase volume as fees fund further trades. Pools with active trading volume particularly benefit from compounding effects over time. 
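The stacked fee rules described above (fee type, collection method, beneficiary) could be modeled roughly as follows. The rule shapes and field names are hypothetical illustrations, not RadixPlanet's actual blueprint.

```python
# Hypothetical sketch of stacking fee rules on a single trade.
# ASSUMPTION: FeeRule fields and method strings are illustrative only.

from dataclasses import dataclass

@dataclass
class FeeRule:
    kind: str               # "percentage" of trade amount, or "fixed" token amount
    amount: float           # fraction of the trade, or a fixed token quantity
    method: str             # "add_to_liquidity" or "route_to_address"
    beneficiary: str = ""   # fee address used when method == "route_to_address"

def apply_fee_rules(trade_amount: float, rules: list) -> dict:
    """Split a trade's collected fees according to the stacked rules."""
    collected = {"add_to_liquidity": 0.0}
    for rule in rules:
        fee = trade_amount * rule.amount if rule.kind == "percentage" else rule.amount
        if rule.method == "add_to_liquidity":
            collected["add_to_liquidity"] += fee  # auto-reinvested into the pool
        else:
            collected[rule.beneficiary] = collected.get(rule.beneficiary, 0.0) + fee
    return collected

# Example: a 0.2% fee re-invested into the pool, stacked with a fixed
# 1-token fee routed to a treasury address.
fees = apply_fee_rules(10_000.0, [
    FeeRule("percentage", 0.002, "add_to_liquidity"),
    FeeRule("fixed", 1.0, "route_to_address", beneficiary="treasury"),
])
```

The "Add to Liquidity" branch is what produces the automatic re-investing behavior: fees accumulate in the pool itself rather than requiring a manual claim.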
Withdrawable Earnings RadixPlanet allows pool creators to configure fee rules with "Withdrawable Earnings" collection methods. This enables liquidity providers to claim fees without reducing their position sizes. Rather than going directly back into pools, fees sent to Withdrawable Earnings get routed to a separate vault. Liquidity providers can then periodically claim their pro-rata portion of fees based on their share of the pool, without impacting their underlying liquidity. By retaining principal while accessing returns, providers can compound gains over longer periods. This helps reduce impermanent loss risks from reducing positions after price volatility. Periodic withdrawals also give incremental income without needing to exit positions fully. Liquidity Mining Rewards RadixPlanet distributes volume-based liquidity mining rewards using unique NFT voucher tokens. These vouchers - Planet_LM - are minted on every trade and deposited to liquidity providers to later redeem for the native Planet token. 10x Planet_LM vouchers enter pools per trade while 1x goes to the trader. This incentivizes providing liquidity in active pools where rewards accrue more quickly from higher volumes. Periodically, vouchers can be redeemed for Planet tokens, similar to many "staking" mechanisms. Volume-based rewards promote network effects and complement liquidity provision where it is most useful. As more activity occurs in a pool, providers earn greater rewards from the increased trades. This aligns incentives across traders and liquidity suppliers. Protocol Fees RadixPlanet charges protocol fees on trades to support ongoing development and sustainability of the DEX. These include: Swap Fees All pools have a 0.2% swap fee that gets split between the RadixPlanet team and the platform treasury. These small fees on every trade provide revenue to expand features, integrate partners, and improve the trading experience.
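The pro-rata claim from a Withdrawable Earnings vault described above can be sketched as a one-line proportion. The function and its parameters are illustrative assumptions, not RadixPlanet's actual code.

```python
# Sketch of a pro-rata claim from a "Withdrawable Earnings" fee vault.
# ASSUMPTION: vault and position values are plain numbers here; on-ledger
# they would live in NFT liquidity-badge state and pool vaults.

def claim_earnings(vault_balance: float, position_liquidity: float,
                   total_liquidity: float) -> float:
    """Fees claimable by one position, proportional to its share of the pool,
    without touching the position's principal."""
    if total_liquidity == 0:
        return 0.0
    return vault_balance * (position_liquidity / total_liquidity)

# A provider holding 25% of the pool's liquidity claims 25% of the fee vault.
payout = claim_earnings(vault_balance=400.0, position_liquidity=250.0,
                        total_liquidity=1_000.0)
```

Because the payout is drawn from the separate vault rather than the pool reserves, the provider's underlying liquidity position is unchanged by the claim.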
Royalty Fees An additional royalty in XRD is levied based on the protocol version the pool resides on. This aligns incentives for users to upgrade to the latest contracts for better fees and features. - v1.0.0 pools: $0.02 per trade - v1.1.0 pools: $0.03 per trade - v1.1.1 pools: $0.03 per trade Royalty fees go to the RadixPlanet team and are used to fund ongoing development. Treasury The RadixPlanet treasury accumulates fees from protocol charges to fund community initiatives. Governance features will help determine allocation of treasury funds to grow the platform and ecosystem. Revenue directed to the treasury comes from: - 0.2% of Swap Fees from all pools - 100% of Royalty Fees from protocol upgrades - Any donations from partners/users The decentralized treasury will allow $PLANET holders to guide budget decisions. Transaction Manifests To interact with the RadixPlanet DEX protocol, wallet transactions require specialized manifests to encode instructions properly. Common manifests used include: - Create pool - Add liquidity - Remove liquidity - Swap tokens - Claim fees - Redeem vouchers These allow end users to access liquidity pools, supply assets, trade tokens, earn yields, and more. RadixPlanet provides template transaction manifests for these core functions that developers can use to build customized experiences via alternative frontends. Having ready-made manifests speeds up integration. For example, the add liquidity manifest encodes details like: - Pool ID - Token types & amounts - Whether to create a new position - Fee rules acceptance With template manifests available, developers can focus more on crafting compelling interfaces rather than low-level protocol interactions. Bitcoin Bridge RadixPlanet offers a two-way bridge for Bitcoin between the Radix network and other blockchains. This allows users to wrap their BTC for use directly in DeFi applications on Radix.
The Bitcoin bridge locks up native BTC and mints a wrapped token (wBTC) to represent Bitcoin on Radix. The wBTC can then be traded in DEX pools, lent, used as collateral, and more - unlocking Bitcoin to work inside the Radix DeFi ecosystem. The process can also be reversed by burning wBTC in exchange for unlocking native BTC back to its original blockchain. This provides liquidity for Bitcoin holders to move between networks while accessing Radix-based financial transactions. Initially, the Bitcoin bridge only supported manual wrapping/unwrapping of BTC with assistance from the RadixPlanet team. Over time, features will expand for direct cross-chain interoperability, starting with public canisters to automate the minting/burning of wrapped tokens. Native support for major external assets like Bitcoin illustrates the flexibility and hybrid capabilities of the Radix protocol for connecting external blockchains to its DeFi platform. ## Radixnode.io URL: https://radix.wiki/ecosystem/radixnodeio Updated: 2026-02-06 Summary: Radixnode.io is a website that provides safe staking services on the Radix network. It offers high-performance community-based nodes owned and operated by longstanding members of the community. Radixnode.io is a website (http://radixnode.io/) that provides safe staking services on the Radix network. It offers high-performance community-based nodes owned and operated by longstanding members of the community, JoshC and Slightlyiffy (https://www.notion.so/Slightlyiffy-678aacf710e64a3da321112aa1cd3eea?pvs=21) . Overview They utilize industry-standard security redundancy and monitoring practices to provide the best experience for their delegates. In terms of performance, Radixnode.io uses NVMe storage, EPYC CPUs, and at least double the recommended RAM to ensure optimal performance. They guarantee a minimum uptime of 99.9%. Team Radixnode.io consists of two community members, namely JoshC and Slightlyiffy.
Here's some information about them: Slightlyiffy Slightlyiffy is a Genesis crew member and a moderator of the official Radix Discord. They have been part of the Radix community since the inception of eMunie and have 15 years of experience in server administration. Their involvement in the community and server administration expertise make them well-suited to handle the responsibilities of operating Radixnode.io. JoshC JoshC joined Radix in 2016 and has been an active community member in the Radix Discord since then. While specific details about their role in Radixnode.io are not publicly documented, JoshC is the Community Lead for Behodlerio. This demonstrates their dedication to advancing decentralized finance (DeFi) and bringing it to the mainstream. Together, JoshC and Slightlyiffy form the team behind Radixnode.io. They are committed to serving their delegators and the Radix community, ensuring constant monitoring, maximum uptime, and security of the nodes. Their involvement in the Radix ecosystem and experience make them capable partners in building a strong DPoS network for the Radix network. Vision The team at Radix Node is committed to building a strong DPoS (Delegated Proof of Stake) network filled with honest and committed node-runners. They see Radix as a platform that can bring about a decentralized and fairer financial system. By operating community-owned nodes and ensuring constant monitoring, maximum uptime, and security, Radix Node is dedicated to serving its delegators and the Radix community. They commit to being transparent with delegators and operating nodes at a fair rate of 25%. Overall, their vision is aligned with creating a decentralized financial ecosystem that provides greater accessibility and fairness to individuals.
Benefits If you join Radixnode.io as a delegator, you can expect several benefits: Community Ownership Radixnode.io is owned and operated by longstanding members of the Radix community, JoshC and Slightlyiffy. By joining Radixnode.io, you become part of a community-driven initiative, contributing to the growth and development of the Radix network. Security Radixnode.io prioritizes node security and has implemented top-level security measures to ensure the safety of delegators' staked assets. They utilize industry-standard security redundancy and monitoring practices, providing delegates with a secure staking experience. Performance Radixnode.io operates high-performance community-based nodes. They have chosen suppliers and specifications, such as NVMe storage, EPYC CPUs, and double the recommended RAM, to ensure optimal performance. This means you can expect fast and reliable staking services. Easy Staking Process Staking with Radixnode.io is user-friendly and accessible. You don't need special technical knowledge, your own data center, or 24/7 network control. Simply stake your XRD tokens via the Radix wallet, and Radixnode.io takes care of the rest, allowing you to earn staking rewards hassle-free. Experienced Team The Radixnode.io team consists of JoshC and Slightlyiffy, who are longstanding members of the Radix community. They have extensive experience in server administration and actively contribute to the Radix ecosystem. By joining Radixnode.io, you benefit from the expertise and dedication of this experienced team. Commitment to Delegators Radixnode.io is committed to serving its delegators to the fullest extent. They commit to constant monitoring, maximum uptime, and ensuring security for delegators' staked assets. Additionally, they operate nodes at a fair rate of 25% and are transparent with delegators about any necessary changes. 
Security Here are some key points related to the security (https://www.radixnode.io/security) measures implemented by Radixnode.io: Node Security Radixnode.io emphasizes the importance of node security and has taken every possible step to ensure top-level security and resilience against attacks. They utilize industry-standard security redundancy and monitoring practices to provide the best experience possible for their delegates. Secure Infrastructure The website mentions that Radixnode.io has chosen suppliers and specifications for their validators to be as performant as possible. This includes using NVMe storage, EPYC CPUs, and double the recommended RAM. This focus on hardware performance ensures a secure and stable infrastructure for staking on the Radix network. Commitment to Delegators' Security Radixnode.io commits to serving their delegators to the fullest of their ability and provides constant monitoring, maximum uptime, and security. This commitment ensures that delegators' staked assets are protected and secure. ## RadixCharts URL: https://radix.wiki/ecosystem/radixcharts Updated: 2026-02-06 Summary: RadixCharts is an analytics platform that provides real-time insights into the Radix ecosystem. Founded in 2022, it offers various statistics and data visualizations. RadixCharts (https://radixcharts.com/project) is an analytics platform that provides real-time insights into the Radix ecosystem. Founded in 2022, it offers various statistics and data visualizations to help users understand the performance and growth of the Radix network. The platform showcases information about token prices, decentralized applications (dApps), transaction volume, staking statistics, and social media presence. https://youtu.be/hQNCyNCR0D8 (https://youtu.be/hQNCyNCR0D8) History The inspiration for RadixCharts came from the Radix community, which manually checked Telegram group members to create rankings for NFT projects.
Recognizing the need for automation, and with a passion for data and statistics, the founder initiated the project in September 2022. Since its inception, RadixCharts' services have expanded considerably. Features - Total Value Locked (TVL): One of the primary metrics provided by RadixCharts is the Total Value Locked (https://docs.radixcharts.com/data-analytics/total-value-locked) in the Radix network. TVL is a crucial indicator of the network's health and the trust users have in it. - $TOP Token: The $TOP token (https://docs.radixcharts.com/token/usdtop) is utilized as an incentive for Radix influencers. It has a limited supply of 11,000 to maintain its exclusivity. - Validator Insights: RadixCharts offers detailed information on network validators. This feature is essential for those interested in staking their XRD. It provides intricate data on each validator's performance, uptime, and other critical statistics. Team RadixCharts was founded by Nelly Sayon. https://youtu.be/pAcoCAOUJLA?si=feaf3dlLcN9_zspw (https://youtu.be/pAcoCAOUJLA?si=feaf3dlLcN9_zspw) Achievements and Recognition - Scrypto Portfolio Challenge (September 2022): Nelly, the founder of RadixCharts, participated in the Scrypto Challenge, focusing on portfolio management and yield farming. After a few weeks of learning Scrypto, she submitted her project, "Juicy Yields," and received an honorable mention for her blueprint. - Dandelion Program Grant (January 2023): The Radix Ecosystem Metrics page (https://www.radixdlt.com/blog/runs-on-radix-q-a-radixcharts) on RadixCharts was developed as a result of a Dandelion Program grant. This initiative by RDX Works rewarded individuals with innovative marketing ideas. RadixCharts contributed by offering transparent, real-time visualizations of economic activities on the Radix platform, shedding light on the health and growth of the Radix ecosystem.
- Developer Incentive (September 2023): Nelly, the founder, was awarded $1,500 of XRD as part of the Scrypto Developer Incentive program. The RDX Works team recognized her contributions to RadixCharts, which has become an integral tool for data retrieval and visualization. Partnerships - RadixScan: RadixCharts collaborates with RadixScan (/ecosystem/radixscan), a community explorer for the Radix network. RadixScan offers tools and extensive information on network activity, validators, tokens, and more. - Crew Labs: For their Validator Node (https://docs.radixcharts.com/node/validator-node), RadixCharts has partnered with Crew Labs. This collaboration involves a joint venture of innovators and entrepreneurs who are developing Web3 solutions. ## RADIX REVIEW URL: https://radix.wiki/ecosystem/radix-review Updated: 2026-02-06 Summary: Radix Review is a community podcast highlighting the Radix blockchain. 💰 Assets: https://ociswap.com/resource_rdx1thmvxfv5yngqf29vv58cm222mp9r5fxe5yap6w0dqzudrzz9yjxn86 (https://ociswap.com/resource_rdx1thmvxfv5yngqf29vv58cm222mp9r5fxe5yap6w0dqzudrzz9yjxn86) Status: 🟢 The #1 Community Podcast on Radix Welcome to XRD's most popular podcast! Radix Review focuses on highlighting the Radix blockchain. It is recorded LIVE every Friday at 7PM UTC on X and is available on all major podcast platforms. Follow our hosts on X to be notified of live recordings! https://x.com/NFTMachinist (https://x.com/NFTMachinist) https://x.com/FelixXRD (https://x.com/FelixXRD) 🔥Saving The World One Recording At A Time🔥 Radix Review is for ENTERTAINMENT PURPOSES ONLY! Always do your own research and invest at your own risk. ## Radix List URL: https://radix.wiki/ecosystem/radix-list Updated: 2026-02-06 Summary: Radix List is an online platform dedicated to the exploration of community projects developed on the Radix network.
The website features a wide variety of initiatives. Radix List is an online platform dedicated to the exploration of community projects developed on the Radix network. The website features a wide variety of initiatives, including token giveaways, validator services, Non-Fungible Tokens (NFTs), Play-to-Earn (P2E) games, and more. Radix List was created by Chris from pica.finance and Kansuler (https://www.notion.so/Kansuler-f81f481baf9448dc92388bbd082126b2?pvs=21). Features Project Listings Radix List provides a comprehensive listing of various projects from the Radix community. Some of the listed initiatives include: - EasyMoon: A token giveaway project offering free tokens every Monday. - RADIATOR: A value transfer optimization initiative with loyalty programs, cashbacks, and more. It includes a low-fee validator node. - Blue Chick NFTs: A collection of unique, procedurally generated NFTs in different rarities. - Arcane Labyrinth: A Play-to-Earn game developed on top of RadixDLT technology, offering engaging and exciting gameplay. - PYROS WORLD: A project offering metaverse-ready 3D NFTs on Radix DLT. - RadixEcosystem: A place to find information about projects on Radix List. - DeXian Staking Earning: A liquid staking protocol providing one-stop Fast Stake and Fast Unstake services. - PourpoCats: A collection of 10,000 unique pixel art Pourpocats using favorite tokens. - Radical Staking: An OG validator with solid performance, lotteries, giveaways, and special promotions for delegators. - AlphaDEX: A scalable on-ledger order book exchange platform. - Radix Jobs: A service connecting employers with job seekers on Radix, allowing payments in XRD. - SRWA: Builders using Radix and Scrypto technology for RWA and lending. Notifications Radix List offers live push notifications to its users when new listings are added to the platform. Users can subscribe to the Radix List announcement channel to receive these updates.
## Radix Labs URL: https://radix.wiki/ecosystem/radix-labs Updated: 2026-02-06 Summary: Radix Labs is the research and development arm of the Radix Foundation, dedicated to advancing distributed ledger technology with a particular focus on scalability. Radix Labs is the research and development arm of the Radix Foundation (https://www.radixdlt.com/labs), dedicated to advancing distributed ledger technology with a particular focus on scalability solutions. https://www.youtube.com/live/fF7KfHzVOoc (https://www.youtube.com/live/fF7KfHzVOoc) Overview Radix Labs works closely with corporations and institutions to develop solutions that shape the future of distributed systems and decentralized finance. A key focus of their research has been the development of consensus mechanisms that can support high transaction throughput while preserving atomic composability (https://www.radixdlt.com/blog/hyperscale-alpha---part-i-the-inception-of-a-hybrid-consensus-mechanism) - a critical feature for decentralized financial applications. Under the leadership of Dan Hughes (https://www.notion.so/Dan-Hughes-ce73c8c28e8446b5b4e8d997f8be3d98?pvs=21), Radix Labs has demonstrated breakthrough performance metrics, achieving up to 3.2 million transactions per second (https://www.radixdlt.com/labs) in their Hyperscale platform testing. This represents a significant advancement compared to other distributed ledger technologies, with their testing showing performance approximately 20 times greater than competing platforms. The lab's research focuses on solving fundamental blockchain scalability challenges through innovations like Cassandra, Cerberus, and the Radix Engine (https://www.radixdlt.com/blog/radix-labs---hyperscale-alpha-test). Their approach combines aspects of traditional Byzantine Fault Tolerant (BFT) protocols with novel consensus mechanisms, aiming to achieve unlimited linear scalability while maintaining network security and decentralization.
A distinguishing aspect of Radix Labs' work is their commitment to academic rigor and transparency. Their consensus theory and atomic cross-shard consensus mechanisms have been peer-reviewed and mathematically verified by researchers at the University of California, Davis (https://www.radixdlt.com/labs) , establishing a strong theoretical foundation for their technological implementations. History The origins of Radix Labs' research can be traced back to 2012 (https://www.radixdlt.com/labs) , when initial investigations into blockchain scalability began with a modified Bitcoin fork. These early experiments aimed to discover realistic upper bounds for transaction processing, achieving approximately 1,000 transactions per second (TPS) through various optimizations and improvements. In 2013, the team developed the concept of "Block Trees," introducing a tree structure of blockchains that enabled concurrent transaction processing with delayed state reconciliation (https://www.radixdlt.com/labs) . This innovation pushed performance to around 2,000 TPS. By 2014, research had progressed to implementing Directed Acyclic Graphs (DAGs), which facilitated easier sharding and reduced messaging overhead, reaching approximately 2,500 TPS. A significant breakthrough came in 2017 with the development of CAST (Cooperative Audit State Transfer) (https://www.radixdlt.com/labs) , which separated state and transactional data. This architectural change improved the management of messaging overhead and enhanced security against double-spend attacks, though it temporarily reduced maximum throughput to maintain these security properties. 2020 marked a pivotal year with two major developments (https://www.radixdlt.com/labs) . First, the introduction of Tempo completely reimagined state representation and sharding design, dramatically reducing processing and reconciliation overhead to achieve 1.4 million TPS. 
Later that year, the development of Cerberus began, introducing a complementary consensus mechanism that enabled secure cross-shard communication while preserving network properties. The network saw its first practical implementation in 2022 with the Olympia launch (https://www.radixdlt.com/labs) , which utilized a hybrid consensus mechanism combining Hotstuff and Cerberus. This was followed by the Babylon release in 2023, which introduced secure and efficient decentralized applications (dApps) through the Scrypto programming language (https://www.radixdlt.com/blog/radix-labs-roadmap---to-hyperscale-and-beyond) , along with an integrated user experience. As of 2024-2025, Radix Labs has entered the Hyperscale Alpha phase (https://www.radixdlt.com/blog/radix-labs---hyperscale-alpha-test) , conducting groundbreaking tests targeting sustained performance of over 1 million complex transactions per second using a globally distributed validator network. This research network, formerly known as Cassandra (https://www.notion.so/Cassandra-fdf402d1427e4a42a366b74ae40b6c86?pvs=21) , represents a crucial step toward achieving unlimited linear scalability while maintaining atomic composability and decentralization. Technology Hyperscale Architecture Radix Labs' Hyperscale architecture represents a breakthrough in distributed ledger technology, combining a hybrid consensus mechanism that integrates both proof-of-work and proof-of-stake systems (https://www.radixdlt.com/blog/hyperscale-alpha---part-ii-design-principles) . The architecture is built on two primary components: the Cassandra consensus protocol for intra-shard consensus and the Cerberus protocol for cross-shard communication. 
The Cassandra consensus protocol manages consensus within individual shards (https://www.radixdlt.com/blog/hyperscale-alpha---part-i-the-inception-of-a-hybrid-consensus-mechanism) , ensuring progress even during adverse network conditions while providing strong safety guarantees when operating below its failure bound. This protocol innovates on traditional Byzantine Fault Tolerant (BFT) systems by incorporating elements of Nakamoto consensus, allowing for both weak liveness and deterministic safety. A unique feature of the architecture is its implementation of cryptographic sortition (https://www.radixdlt.com/blog/hyperscale-alpha---part-iii-integrating-sortition) , which enables efficient validator selection while maintaining decentralization. The system selects primary and secondary proposers through a verifiable random function, with the number of primary proposers calculated as the square root of the total participant count. Key Features The Hyperscale architecture achieves unprecedented performance metrics, demonstrating capacity for up to 3.2 million transactions per second (https://www.radixdlt.com/labs) , significantly outperforming other distributed ledger platforms. This performance is achieved while maintaining atomic composability across shards, a critical feature for complex decentralized financial applications. 
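The sortition rule described above (roughly √n primary proposers selected per round via a verifiable random function) can be illustrated with a short sketch. This is not Radix Labs' implementation: the SHA-256 scoring below is only a deterministic stand-in for a real VRF, and the validator IDs are invented.

```python
import hashlib
import math

def select_primary_proposers(validators, round_seed):
    """Pick ~sqrt(n) primary proposers for a round; lowest hash wins.

    Stand-in for VRF-based sortition: hashing (seed, validator id) gives
    every node the same deterministic, uniformly-spread ordering. A real
    VRF additionally lets each validator *prove* its selection.
    """
    # Number of primaries is the square root of the participant count
    k = max(1, math.isqrt(len(validators)))
    scored = sorted(
        validators,
        key=lambda v: hashlib.sha256(f"{round_seed}:{v}".encode()).digest(),
    )
    return scored[:k]

# 100 validators -> 10 primary proposers for this round
primaries = select_primary_proposers(
    [f"validator-{i}" for i in range(100)], round_seed="round-7"
)
```

Because every node can recompute the same ordering from the shared seed, selection requires no extra coordination round; with a real VRF, a validator's selection also stays unpredictable to others until its proof is revealed.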
The system's hybrid consensus mechanism provides several key advantages (https://www.radixdlt.com/blog/hyperscale-alpha---part-ii-design-principles) : - Enhanced security through a dual sybil resistance model requiring control of both computational power and stake for successful attacks - Flexible participation options allowing validators to contribute through either mining or staking - Built-in protection against "nothing at stake" problems through proof-of-work requirements - Improved double-spend protection through the combination of stake-based voting and computational work The architecture employs a sophisticated approach to network coordination (https://www.radixdlt.com/blog/hyperscale-alpha---part-iii-integrating-sortition) , using explicit voting and side-channel communication for efficient consensus building. Quorum certificates provide cryptographic proof of validator agreement, while a two-phase commit process ensures strong safety guarantees without compromising system liveness. A notable innovation is the system's ability to maintain atomic composability at scale (https://www.radixdlt.com/blog/radix-labs---hyperscale-alpha-test) , allowing complex, multi-step transactions to execute as single, instant operations across different shards. This capability is particularly crucial for decentralized finance applications, where multiple actions often need to be executed atomically to maintain transaction integrity. Development Roadmap Current Phase (2024-2025) The Hyperscale Alpha testing phase represents a crucial milestone in Radix Labs' development (https://www.radixdlt.com/blog/radix-labs---hyperscale-alpha-test) . During this period, the network aims to demonstrate sustained performance of over 1 million complex transactions per second using a globally distributed validator set. Unlike typical blockchain performance tests, this phase emphasizes real-world conditions with all security features enabled and validators distributed across multiple continents. 
The Cassandra research network serves as a testing platform through most of 2025 (https://www.radixdlt.com/blog/radix-labs-roadmap---to-hyperscale-and-beyond) , supporting critical development work including: - Community-scale testing initiatives in December 2024 - Additional third-party testing throughout H1 2025 - Integration of the Radix Engine by Q3 2025 - Comprehensive community testing of the combined Cassandra and Radix Engine systems in Q4 2025 The Radix Engine enhancement track (https://www.radixdlt.com/blog/radix-labs-roadmap---to-hyperscale-and-beyond) runs parallel to Cassandra development, with key milestones including: - Planning of sharding upgrade architecture in Q4 2024 - Implementation of sharding capabilities by Q3 2025 - Soft audit of the sharded system in Q4 2025 Future Development (2025-2027) The final development phase focuses on implementing Xian (https://www.radixdlt.com/blog/radix-labs-roadmap---to-hyperscale-and-beyond) , the production-ready network that will bring Radix's scalability vision to fruition. This phase includes: - Production planning beginning in Q4 2025 - Full Rust implementation throughout 2026, encompassing: - Network infrastructure development - Database architecture - Cryptographic systems - Integration of Cassandra consensus - Implementation of the sharded Radix Engine The roadmap culminates in three major releases (https://www.radixdlt.com/blog/radix-labs-roadmap---to-hyperscale-and-beyond) : - Alpha Xian in early 2027 - Beta Xian in mid-2027 - Full production launch in the second half of 2027 This timeline represents a methodical approach to achieving Radix's vision of a globally scalable distributed ledger system capable of supporting billions of users while maintaining atomic composability and decentralization. ## Radit URL: https://radix.wiki/ecosystem/radit Updated: 2026-02-06 Summary: Radit.io was a community-owned investment message board that operated on the Radix network. 
It presented a unique concept where users could invest in messages and earn a yield based on their investments. Radit.io was a community-owned investment message board that operated on the Radix network. It presented a unique concept where users could invest in messages and earn a yield based on their investments. The platform was known as the first community-owned investable message board powered by Radix. It was created by CaviarLabs, a technology company focused on blockchain and decentralized finance (DeFi) solutions. https://youtu.be/3DDR41WD7p4 (https://youtu.be/3DDR41WD7p4) Operation Users on Radit.io could post new messages by sending them to a special gazette wallet along with any amount of Radix tokens ($XRD). The posted messages were owned, similar to how a Non-Fungible Token (NFT) is owned. When other users invested in these messages by sending $XRD, the original poster earned a yield. Every message investment gave the investor some ownership of that message, and they could earn $XRD whenever anyone else invested in a message they owned a part of. As of mid-2023, Radit.io (https://www.radit.io) had seen an investment of 5,249K $XRD, with 24,513 messages posted by 1,095 unique users. Governance Radit.io also had its own governance token, $RADIT. Owners of $RADIT tokens possessed 30% ownership of every message posted on the platform and had a say in the future of Radit.io. The platform ensured that 100% of the $XRD sent to Radit.io was distributed back as rewards at the end of each day, emphasizing its commitment to community ownership and participation. Notable Features One of the unique features of Radit.io was the concept of the "100k XRD Message". When any message on the platform reached an investment of 100k XRD, a message token was minted on the Radix network. This newly minted token had a supply of 100k units, which were distributed to the message owners proportionally to their percentage ownership at the point the message was worth 100k XRD.
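The pro-rata split described for the "100k XRD Message" feature amounts to simple arithmetic. The sketch below illustrates it with invented investor names; it is not Radit's actual code.

```python
def mint_message_token(investments_xrd, supply=100_000, threshold=100_000):
    """Split a newly minted 100k-unit message token among a message's
    investors, in proportion to the XRD each has invested.

    `investments_xrd` maps investor -> XRD invested in this message.
    Hypothetical helper for illustration only.
    """
    total = sum(investments_xrd.values())
    if total < threshold:
        raise ValueError("message has not yet reached the 100k XRD threshold")
    # Each investor's share of the supply equals their share of the total stake
    return {who: supply * amount / total for who, amount in investments_xrd.items()}

# A message funded 60/40 by two investors splits the token 60k/40k
shares = mint_message_token({"alice": 60_000, "bob": 40_000})
```

In practice the 30% platform-wide ownership held by $RADIT holders would appear as one more entry in the split; it is omitted here for brevity.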
The token included all the details of the original message and was immortalised onto the Radix Ledger. ## Radical Staking URL: https://radix.wiki/ecosystem/radical-staking Updated: 2026-02-06 Summary: Radical Staking is a validator on the Radix network. Validators help secure the blockchain network by validating transactions and maintaining the integrity of the network. Radical Staking is a validator on the Radix network. Validators help secure the blockchain network by validating transactions and maintaining the integrity of the network. The Radical Staking validator specifically focuses on staking and contributes to the consensus mechanism of the network. This validator plays a crucial role in ensuring the security and decentralization of the network. Overview The Radical Staking (https://www.radicalstaking.com/) validator is a validator node on the Radix blockchain network. It has been operating since the Radix Olympia Betanet launch on April 28, 2021, and is known for being one of the most experienced, reliable, and trusted validator nodes in the Radix community. Radical Staking validator has achieved significant success, with over 82 million XRD delegated and 2,070 delegators so far. In an effort to promote diversity and decentralization, they also assist smaller validator nodes within the Radix ecosystem. If you hold any XRD or eXRD and want to stake or delegate your tokens, Radical Staking validator provides a high-performance validator node in a secure cloud environment. They offer 24x7x365 monitoring and are considered to be one of the top Radix validator nodes. Partnerships The Radical Staking validator has formed partnerships with several projects within the Radix ecosystem. XSEED XSEED (/ecosystem/xseed) is a decentralized accelerator and investment platform on the Radix network. Radical Staking validator has partnered with XSEED to support and foster the growth of new projects within the ecosystem.
Ociswap Ociswap (/ecosystem/ociswap) is a decentralized exchange (DEX) built on Radix. Radical Staking validator collaborates with Ociswap to provide liquidity and support the trading of tokens on the platform. Radix Collection Radix Collection is a decentralized NFT (non-fungible token) platform on the Radix network. The partnership with Radical Staking validator aims to promote the adoption and use of NFTs within the Radix ecosystem. Delphibets Delphibets (/ecosystem/delphibets) is a decentralized prediction market platform built on Radix. Radical Staking validator joins forces with Delphibets to ensure the integrity and security of prediction market transactions. Cerby Cerby is a decentralized identity platform on Radix that enables users to have control over their personal data. Radical Staking validator supports the security and verification processes of Cerby. Mayas Protocol Mayas Protocol is a decentralized stablecoin platform on Radix. The partnership with Radical Staking validator contributes to the stablecoin ecosystem by maintaining and securing the network. These partnerships demonstrate the commitment of Radical Staking validator towards the growth and development of the Radix ecosystem by supporting various projects and initiatives on the network. Product The Radical Staking validator is primarily involved in staking services on the Radix blockchain network. As a validator, their main product is the secure and reliable operation of a node that participates in the consensus and validation process on the Radix network. By staking with Radical Staking validator, users can delegate their XRD tokens and earn staking rewards, which are distributed based on the validator's performance and the amount of tokens delegated. In addition to staking services, Radical Staking validator also supports the growth and development of the Radix ecosystem through partnerships with various projects and initiatives within the network.
They contribute to the security and stability of the network, which ultimately benefits all participants in the Radix ecosystem. Benefit Joining the Radical Staking node can provide several benefits to users who want to participate in the staking process on the Radix network. Here are some potential benefits of joining the Radical Staking node: Staking Rewards By delegating your XRD tokens to the Radical Staking node, you can earn staking rewards. Validators typically distribute rewards to their stakers based on factors like the number of tokens staked and the validator's performance. Passive Income Staking allows you to earn a passive income by simply holding and staking your tokens. Instead of keeping your tokens idle, staking them with Radical Staking can generate regular rewards, providing you with an additional income stream. Network Security By participating in staking with reputable validators like Radical Staking, you contribute to the security and decentralization of the Radix network. Validators play an essential role in securing the network by validating transactions and maintaining the integrity of the blockchain. Community Engagement Joining the Radical Staking node can provide opportunities to connect with other community members who are also staking their tokens. Engaging with the Radix community can lead to valuable insights, collaborations, and access to information about new projects and initiatives within the ecosystem. Support the Radix Ecosystem By staking with Radical Staking, you support the growth and development of the Radix ecosystem. Validators contribute to the stability and integrity of the network, which benefits all participants, including developers, users, and other stakeholders within the ecosystem. Security and Trust Radical Staking validator aims to provide a secure and trustworthy staking service. 
Here are a few points to consider regarding the security and trustworthiness of Radical Staking validator: Reputation Look for information about the reputation of Radical Staking validator within the Radix community. Check if they have a history of reliable operation and positive feedback from stakers. Transparency Validators that prioritize transparency often provide detailed information about their team, their infrastructure, and their security measures. Check if Radical Staking validator provides transparent information about their operation and any security measures they have in place. Experience and Expertise Evaluate the experience and expertise of the team behind Radical Staking validator. Look for any relevant expertise in blockchain technology, staking, and security. Consider if they have a track record of successfully operating validators on other networks. Governance Participation Validators that actively participate in the governance of the network can be seen as more trustworthy. Check if Radical Staking validator actively engages in network governance, contributing to the decision-making processes of the Radix network. Diversification of Nodes Validators that operate multiple nodes across different geographical locations improve the overall security and reliability of the network. It is worth checking if Radical Staking validator has a distributed infrastructure that reduces the risk of a single point of failure. Community Trust Research the sentiment of the Radix community toward Radical Staking validator. Look for comments, reviews, or endorsements from other community members to assess the level of trust others have in their services. ## Proven Network URL: https://radix.wiki/ecosystem/proven-network Updated: 2026-02-06 Summary: Proven Network is a companion network designed to complement and extend the capabilities of the Radix distributed ledger technology (DLT).
The network aims to enable a new generation of Web3 decentralized applications (dApps). Proven Network (https://docs.proven.network/) is a companion network designed to complement and extend the capabilities of the Radix distributed ledger technology (DLT). The network aims to enable a new generation of Web3 decentralized applications (dApps) that can provide user experiences comparable to traditional Web2 applications while maintaining strong privacy and security guarantees. Overview The Proven Network provides three primary categories of services: off-ledger auditable compute, encrypted storage, and a global messaging infrastructure (https://docs.proven.network/). The compute layer utilizes a WebAssembly (WASM)-based virtual machine that operates within trusted execution environments (TEEs), ensuring complete privacy and protection against tampering, even from hardware operators. This system employs a serverless model where developers only pay for actual usage, with the ability to scale to zero when inactive. Unlike many blockchain-adjacent services, Proven Network explicitly does not aim to be decentralized, permissionless, or censorship-resistant (https://docs.proven.network/decentralisation), as these features are already provided by the underlying Radix DLT. Instead, it positions itself as an alternative to traditional cloud service providers like Cloudflare, Vercel, or AWS Lambda, but with additional cryptographic security guarantees and specialized tooling for Radix developers. The network's trust model (https://docs.proven.network/the-trust-model-and-cryptography) is built on three main pillars: verification of network node software, verification of application code, and verification of end-user identity. This model utilizes remote attestation and public build logs to ensure transparency and auditability of all system components.
For user identity, the system integrates directly with Radix's native Persona system and uses ROLA (Radix Off-Ledger Authentication) for session management, eliminating the need for additional private key management. The platform supports various use cases including multiplayer game servers, oracle services, asset-based access control, scheduled transactions, GDPR-compliant applications, decentralized identity verification, private DAO voting mechanisms, and automated trading systems. These applications can leverage the network's infrastructure to maintain on-ledger asset management while performing complex computations off-ledger in a secure and verifiable manner. Core Components Off-ledger Compute Proven Network's compute layer (https://docs.proven.network/) operates on a WASM-based virtual machine that runs within trusted execution environments (TEEs). This architecture ensures that all computations remain private and tamper-proof, protecting against interference even from the hardware operators themselves. The system implements a serverless computing model where developers are charged only for actual usage, with the ability to scale to zero during periods of inactivity. The compute environment (https://docs.proven.network/coding-verifiable-components) utilizes the WebAssembly Component Model, allowing developers to write components that can be verified and executed within the network's secure environment. These components can be written in languages that compile to WebAssembly and interact with the system through WASIP2 interfaces, providing access to specialized components for storage, messaging, and Radix DLT interaction. Storage System The storage infrastructure (https://docs.proven.network/storage-options) comprises four distinct types of storage solutions, each designed for specific use cases. The key-value storage system supports UTF-8 string keys with arbitrary byte values up to 1MB, offering immediate consistency.
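The key-value limits just described (UTF-8 string keys, byte values up to 1MB) can be expressed as a small client-side guard. The helper below is a hypothetical sketch, not part of any Proven SDK; the real service enforces its own limits server-side.

```python
MAX_KV_VALUE_BYTES = 1 << 20  # documented 1MB cap on key-value entries

def check_kv_put(key, value):
    """Validate a put against the documented key-value constraints.

    Hypothetical client-side check mirroring the limits described in the
    text, not an actual Proven Network API.
    """
    if not isinstance(key, str):
        raise TypeError("key-value keys must be UTF-8 strings")
    key.encode("utf-8")  # raises UnicodeEncodeError on unencodable keys
    if not isinstance(value, (bytes, bytearray)):
        raise TypeError("key-value values are arbitrary bytes")
    if len(value) > MAX_KV_VALUE_BYTES:
        raise ValueError("value exceeds 1MB; use blob storage for large payloads")

check_kv_put("user:profile", b'{"theme": "dark"}')  # fits comfortably
```

Oversized payloads belong in the blob store instead, which trades the size cap for eventual rather than immediate consistency.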
Blob storage removes the size limitation but operates on an eventual consistency model. The platform also includes a distributed SQLite implementation for relational data storage and a specialized keychain storage for managing cryptographic material. These storage systems can operate in three distinct contexts: application, identity, and NFT. The application context (https://docs.proven.network/storage-options) provides global state storage accessible across all users, while the identity context ties data to specific user personas authenticated through ROLA. The NFT context binds storage access to token ownership, automatically transferring data access when the associated NFT changes hands. Global Messaging Infrastructure The messaging infrastructure (https://docs.proven.network/) serves as the connective tissue between the network's various components, linking the TEE-based compute layer, encrypted storage systems, and Radix-native on-ledger events. This system provides cryptographically-authenticated channels that extend to edge devices, enabling real-time trust-minimized compute and collaboration between users. The messaging system supports both HTTP-based components and direct integrations through the platform's SDK. HTTP endpoints (https://docs.proven.network/http-based-components) can be created to interface with third-party systems such as OAuth providers, with support for JWT-based authorization to maintain secure access to session-based storage resources. The infrastructure ensures that all communications maintain the same level of security and verifiability as the compute and storage components, creating a cohesive and secure environment for decentralized applications. Trust Model Network Verification The network's trust architecture (https://docs.proven.network/the-trust-model-and-cryptography) is built entirely on nodes operating within trusted execution environments. 
Developers and third parties can independently verify the network by running the build process, which produces measurements capturing the complete state of the application software, Linux kernel, and RAMFS disk state. These measurements are fundamental to the trust architecture, enabling all parties to verify and audit the network's mechanisms through remote attestation. Network nodes must demonstrate (https://docs.proven.network/the-trust-model-and-cryptography) to existing network participants that they are running identical code in the same configuration before they can become data replicas or execute computational workloads. The node software architecture treats all external services, including its parent host and infrastructure operator, as untrustworthy. This zero-trust approach requires all external communications to be cryptographically secured based on a root of trust embedded within the auditable codebase. Application Code Verification Application code verification (https://docs.proven.network/the-trust-model-and-cryptography) is handled through a comprehensive build process that occurs entirely within the TEE. Network nodes will only execute code that they have either built themselves or code that has been signed by verified peers. The build process targets the WebAssembly component model, with nodes providing virtualized capabilities to guest code through WASIP2 interfaces. To maintain transparency, the network publicly hosts (https://docs.proven.network/the-trust-model-and-cryptography) the complete build log and all source code inputs. Configuration settings for messaging, storage access, and external service interactions are permanently encoded during the build step. While private environment variables (such as API keys) can be used, their presence (though not their contents) is recorded in the auditable output. 
Application updates are permitted but invalidate existing user sessions, ensuring users must explicitly trust new versions before they can interact with identity-keyed state. User Identity Verification User identity verification (https://docs.proven.network/the-trust-model-and-cryptography) in Proven Network is uniquely integrated with the Radix DLT Persona system, eliminating the need for independent identity management. The system utilizes ROLA (Radix Off-Ledger Authentication) for session-level cryptographic bindings between users and applications. This integration creates a seamless developer and user experience without requiring additional private key management. The session creation process establishes a four-way binding between a user device-generated Ed25519 key, ROLA-originated proofs from their wallet, remote attestation of the TEE environment, and the current version of the application code and configuration. If a session signing key is lost, users can easily generate a new one by logging in again through the Radix Connect button, with ROLA proofs granting access to the same private storage resources when the same Radix Persona is used. Storage Types The storage architecture (https://docs.proven.network/storage-options) encompasses four distinct storage types, each designed for specific use cases. The key-value storage system functions similarly to Redis, requiring UTF-8 string keys while accepting arbitrary byte values up to 1MB, with immediate consistency guarantees. Blob storage removes the size limitation but operates on an eventual consistency model. The platform includes a distributed SQLite implementation for relational data storage, and a specialized keychain storage system for handling cryptographic material that can only be used indirectly through system-level components for ledger interactions. Storage Contexts Storage contexts (https://docs.proven.network/storage-options) define the accessibility and scope of stored data.
The application context provides global state storage that remains consistent across all users and is available in all execution modes. The identity context binds data to specific user personas authenticated through ROLA, ensuring data privacy on a per-user basis. The NFT context ties storage access to token ownership, with data access automatically transferring to new owners upon NFT transfer. Both identity and NFT contexts require valid JWT authentication for HTTP endpoint access. Use Cases Proven Network enables (https://docs.proven.network/) a wide range of decentralized applications that require complex off-chain computation while maintaining secure integration with on-chain assets and identity. These applications leverage the network's combination of trusted execution environments, secure storage, and real-time messaging capabilities. In the gaming sector, developers can create multiplayer game servers (https://docs.proven.network/) with sophisticated rule sets computed in real-time off-ledger while managing in-game assets on-ledger. This architecture is particularly valuable for GameFi applications that require imperfect information between players or games whose rule sets are too complex to compute through consensus-based mechanisms alone. The platform supports oracle services (https://docs.proven.network/) that bridge the gap between on-chain and off-chain environments by leveraging existing internet PKI infrastructures. These oracles can securely pull information from web-based APIs with full audit logs, enabling applications such as prediction markets. The use of hardware-based trust models allows these oracles to operate significantly more cost-effectively than their crypto-economic counterparts. For content distribution and communication, applications can implement (https://docs.proven.network/) sophisticated access control systems that gate content such as songs, videos, or private messaging based on ledger state. 
This allows for token-gated content platforms where access rights are directly tied to asset ownership on the Radix ledger. In the realm of automation, the platform enables (https://docs.proven.network/) programmatic triggering of on-ledger transactions based on timers or cron schedules. Developers can also create trading bots and autonomous agents that react to on-ledger events, ranging from simple dollar-cost averaging tools to complex real-time DeFi trading strategies. For organizations requiring strong compliance measures, Proven Network supports (https://docs.proven.network/) applications that must adhere to global data regulations, with features ensuring requirements like the "Right to be Forgotten" are embedded in auditable code. The platform also facilitates the creation of trust-minimized DID (Decentralized Identity) issuers and verifiers that can serve as integration points for legacy or Web2 identity providers. In the context of decentralized governance, the network enables (https://docs.proven.network/) DAOs (Decentralized Autonomous Organizations) to implement private voting capabilities while maintaining the ability to enact voting results on-ledger. This includes the capability to deploy new smart contracts, enabling true decentralization of control over decentralized applications. For applications requiring advanced cryptographic operations, the platform can serve (https://docs.proven.network/) as a secure environment for computing zero-knowledge proofs for resource-constrained devices such as mobile phones, with the results being verifiable on-ledger. This capability bridges the performance gap between lightweight clients and computationally intensive cryptographic operations. These use cases demonstrate how Proven Network complements the Radix DLT by providing secure, scalable off-chain computation and storage while maintaining the security and verifiability guarantees expected in decentralized systems. 
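Several of these use cases, token-gated content in particular, hinge on the storage contexts described earlier: identity data is private to one persona, while NFT-context data is readable only by the token's current owner. A toy model of that access model might look like the following; this is not the Proven SDK, and the class, method names, and 1MB key-value limit enforcement are illustrative assumptions.

```python
MAX_VALUE_BYTES = 1_000_000  # key-value entries are capped at 1MB

class ContextualKVStore:
    """Toy model of context-scoped key-value storage: 'application' data
    is global, 'identity' data is private to one persona, and 'nft' data
    is readable only by the token's current owner."""

    def __init__(self):
        self._data = {}
        self._nft_owner = {}  # token id -> persona of the current owner

    def put(self, context, scope, key, value):
        if not isinstance(key, str):
            raise TypeError("keys must be UTF-8 strings")
        if len(value) > MAX_VALUE_BYTES:
            raise ValueError("value exceeds the 1MB key-value limit")
        self._data[(context, scope, key)] = value

    def get(self, context, scope, key, persona=None):
        if context == "identity" and persona != scope:
            raise PermissionError("identity data is private to its persona")
        if context == "nft" and self._nft_owner.get(scope) != persona:
            raise PermissionError("nft data follows token ownership")
        return self._data[(context, scope, key)]

    def transfer_nft(self, token, new_owner):
        # Access to nft-context data moves automatically with the token.
        self._nft_owner[token] = new_owner

store = ContextualKVStore()
store.transfer_nft("token#1", "alice")                 # alice owns the token
store.put("nft", "token#1", "unlocked-track", b"...")
assert store.get("nft", "token#1", "unlocked-track", persona="alice") == b"..."

store.transfer_nft("token#1", "bob")                   # token is sold to bob
assert store.get("nft", "token#1", "unlocked-track", persona="bob") == b"..."
```

Note that no explicit re-granting step is needed after the transfer: because access is evaluated against current ownership at read time, gated content moves with the asset, which is the behavior the token-gated platform use case relies on.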
Decentralization Status Proven Network explicitly defines itself as non-decentralized (https://docs.proven.network/decentralisation) , with decentralization, permissionlessness, and censorship resistance being stated non-goals of the project. This architectural decision reflects the network's design philosophy of complementing rather than competing with the Radix DLT, which already provides these characteristics for asset management and contract governance. Instead of pursuing decentralization, the platform positions itself (https://docs.proven.network/decentralisation) as an alternative to traditional cloud service providers such as Cloudflare, Vercel, AWS Lambda, and Google Cloud Platform's Cloud Functions for hosting off-ledger backend systems. The network's primary objective is to deliver comparable quality of service, performance, and developer experience to these centralized platforms while adding an additional layer of cryptographic security and verifiability to off-ledger computation. This design choice means that the platform can only provide (https://docs.proven.network/decentralisation) the same level of censorship resistance guarantees as other serverless alternatives. However, the network compensates for this limitation by ensuring that all off-ledger computation remains as verifiable and auditable as on-ledger computation, creating a transparent and trustworthy environment for application deployment. Future development plans (https://docs.proven.network/decentralisation) include research into a federated architecture that would allow application developers to self-host their own Proven nodes. These nodes would be scoped to specific applications and could function as either primary or failover infrastructure. However, this federation capability remains a future consideration, with current development focused on establishing and stabilizing the core network infrastructure. 
Development Tools Frontend Integration Proven Network provides (https://docs.proven.network/integrating-the-front-end-sdk) a frontend SDK called the Proven dApp Toolkit, which integrates with the existing RadixDappToolkit to enable developers to build decentralized applications. The toolkit is available as a Node.js package (@proven-network/proven-dapp-toolkit) and can be installed via standard package managers. The integration process requires modifications (https://docs.proven.network/integrating-the-front-end-sdk) to the standard RadixDappToolkit initialization, allowing applications to simultaneously access both Radix and Proven Network functionality. A key requirement for proper integration is the inclusion of proof requests for personas and, optionally, for accounts when making data requests through the wallet API. For backend development, the platform supports (https://docs.proven.network/coding-verifiable-components) the WebAssembly Component Model, allowing developers to create verifiable components that run within the network's trusted execution environments. These components can be written using languages that compile to WebAssembly and interact with the system through WASIP2 interfaces. HTTP-based components (https://docs.proven.network/http-based-components) can be created by implementing the wasi:http/proxy interface, enabling integration with third-party systems such as OAuth providers. These components support JWT-based authentication through the Authorization header, allowing secure access to session-based storage resources. Developers working with storage systems have access to multiple storage types (https://docs.proven.network/storage-options) through the platform's APIs, including key-value storage, blob storage, relational storage via distributed SQLite, and specialized keychain storage for cryptographic operations. 
Each storage type can be accessed through different contexts (application, identity, or NFT) depending on the application's requirements. The development environment emphasizes security and verifiability, with all application code (https://docs.proven.network/the-trust-model-and-cryptography) being built within trusted execution environments and verified through a public build process. This ensures that deployed applications maintain the same level of trust and auditability as the underlying network infrastructure. ## Project Elysium URL: https://radix.wiki/ecosystem/project-elysium Updated: 2026-02-06 Summary: Project Elysium is a decentralized Non-Fungible Token (NFT) project on Radix. The project aims to create a decentralized metaverse with a community-driven story Project Elysium is a decentralized Non-Fungible Token (NFT) project on Radix. The project aims to create a decentralized metaverse with a community-driven story. It is more than just an NFT project, as it seeks to create intellectual property that fans and the community can contribute to, share, and benefit from. The goal is to create Web 3.0's first truly decentralized and open-sourced science fantasy saga. Overview Project Elysium was announced on April 20, 2023, with the intention of creating a decentralized story for a decentralized metaverse. The project is based on the principles of Web 3.0, which it interprets not just as a technology, but also as an ideology and a model. The project believes in decentralization, community involvement and governance, and incentivization for the most accomplished and passionate. Initiatives The first initiative of Project Elysium is a story-rich graphic novel and a traditional Dungeons & Dragons (DnD) style Role-Playing Game (RPG). The details of these initiatives are expected to be released in the coming weeks. 
Philosophy Project Elysium believes that the Web 3.0 model can be applied to any company, idea, or industry, including the comic and tabletop RPG industry. These industries have a low barrier of entry and allow for the creation of intellectual property. The project aims to create a place where people are not just consumers and marketing tools, but active participants in the products and experiences they want to enjoy and build. Future Plans Project Elysium plans to release a revised website, whitepaper, and NFT details in the coming weeks. The project aims to provide community members with true freedom of digital asset ownership, participation, and a greatly enhanced user experience. ## Project $Now URL: https://radix.wiki/ecosystem/project-now Updated: 2026-02-06 Summary: Project $NOW is a specialized marketing agency focused on the Radix blockchain ecosystem. The company positions itself as a pioneer in organic content marketing Project $NOW is a  specialized marketing agency focused on the Radix blockchain ecosystem (https://projectnow.io/) . The company positions itself as a pioneer in organic content marketing for cryptocurrency projects, particularly those built on the Radix Chain.  Project $NOW distinguishes itself from traditional agencies by eschewing conventional approaches like flashy advertisements or expensive influencer partnerships (https://projectnow.io/) , instead concentrating on what they term "real powerful marketing." https://youtu.be/R920rS2fotU?si=e-VAP29ELviB8gdM (https://youtu.be/R920rS2fotU?si=e-VAP29ELviB8gdM) Introduction The agency offers a comprehensive suite of services designed to support and promote projects within the Radix ecosystem. These services include  social media management, branding, design, video production, and community engagement strategies (https://projectnow.io/services/) . 
Project $NOW emphasizes its expertise in creating and managing organic content marketing campaigns, leveraging its network and experience to elevate projects in the competitive cryptocurrency space. A key feature of Project $NOW's business model is its  native cryptocurrency token, $NOW (https://projectnow.io/token/) . This token serves multiple purposes within the Project $NOW ecosystem, including providing utility for accessing services and participating in the platform's economy. The company commits to  reinvesting 10% of its revenue back into the $NOW token or allocating it for active marketing campaigns (https://projectnow.io/) , aiming to create a sustainable and growing ecosystem. Project $NOW also collaborates with the  Radix Community Council to offer marketing grants to selected projects on the Radix Network (https://projectnow.io/rcc/) , further cementing its role as a central player in the Radix ecosystem's growth and development. As of 2024, Project $NOW continues to expand its services and partnerships within the Radix blockchain community, positioning itself as a crucial resource for projects seeking to establish and grow their presence in the rapidly evolving world of cryptocurrency and decentralized finance (DeFi). Background The specific founding date and detailed history of Project $NOW are not explicitly stated in the available information. However, the company's mission and vision are clearly articulated through its various services and approaches to marketing in the cryptocurrency space. Project $NOW was established with the primary goal of rewriting the rules of cryptocurrency marketing, specifically for projects on the Radix Chain (https://projectnow.io/) . The company's founders recognized a gap in the market for effective, organic marketing strategies tailored to the unique needs of blockchain and cryptocurrency projects. 
The vision of Project $NOW is rooted in the belief that traditional marketing approaches, such as flashy advertisements and expensive influencer deals, are not the most effective ways to promote cryptocurrency projects. Instead, the company focuses on pioneering organic content marketing strategies (https://projectnow.io/) , leveraging its expertise and network to help projects grow and thrive in the dynamic world of cryptocurrency. Project $NOW's approach is built on a foundation of over 20 years of combined experience in the field (https://projectnow.io/) , although the specific backgrounds of the founding team members are not detailed in the available information. This extensive experience informs their strategy of creating meaningful, engaging content that resonates with the cryptocurrency community. The company's mission extends beyond mere promotion. Project $NOW aims to be a central hub for Radix projects, fostering community engagement and facilitating growth within the ecosystem (https://projectnow.io/) . This is evidenced by their development of the $NOW token, which serves as both a utility token for accessing services and a means of community engagement through features like giveaways and staking rewards. Furthermore, Project $NOW demonstrates a commitment to the broader Radix ecosystem through its partnership with the Radix Community Council to provide marketing grants to promising projects (https://projectnow.io/rcc/) . This initiative underscores the company's dedication to nurturing the growth of the entire Radix network, not just individual clients. In summary, while the specific historical details of Project $NOW's founding are not available, the company's background is characterized by a strong vision for revolutionizing cryptocurrency marketing, a wealth of industry experience, and a commitment to fostering growth within the Radix ecosystem. 
Services Project $NOW offers a comprehensive suite of marketing and promotional services tailored specifically for projects on the Radix blockchain. These services are designed to help cryptocurrency projects gain visibility, engage with their community, and grow their presence in the competitive digital asset space. 3.1 Marketing Services Project $NOW provides a range of specialized marketing services (https://projectnow.io/services/) for cryptocurrency projects: - Video Production: The agency brings projects to life through captivating visuals and storytelling (https://projectnow.io/services/) . This service includes creating promotional videos, explainer content, and other video materials to help projects communicate their value proposition effectively. - Social Media Management: Project $NOW develops and implements social media strategies to enhance visibility for new projects on the Radix Chain (https://projectnow.io/services/) . This service includes content creation, community management, and strategic posting across various social media platforms. - Branding: The company offers branding services to help elevate new projects, creating appealing visual identities (https://projectnow.io/services/) . This service aims to create unique brand identities that capture the essence of each project and resonate with its target audience. - Design: Specializing in design for Radix crypto projects (https://projectnow.io/services/) , Project $NOW crafts visuals that engage communities. This service covers various design needs, from logos and graphics to user interface elements for decentralized applications (dApps). - Giveaway Campaigns: To boost engagement, Project $NOW offers a Giveaway Campaign Service, expertly crafting hype around projects (https://projectnow.io/services/) . This service helps projects attract attention and grow their community through strategic giveaways. 
- BuyBOT and Twitter Alerts: The agency provides a Radix BuyBOT and Twitter service to elevate hype by instantly alerting a project's community (https://projectnow.io/services/) about important updates or trading activity. 3.2 Platform Features In addition to its marketing services, Project $NOW has developed a platform with several features designed to support projects and engage the Radix community: - Central Hub for Radix Projects: The platform serves as a focal point for projects built on the Radix blockchain (https://projectnow.io/) , providing a space for discovery and interaction. - Community Engagement Tools: Project $NOW offers tools and strategies to help projects build and maintain active communities around their tokens or applications. - Giveaway Management: The platform includes features for managing giveaways, which can help projects boost engagement and attract new users (https://projectnow.io/) . - $NOW Token Integration: The platform leverages the $NOW token to provide additional benefits and engagement opportunities for users and projects alike. 3.3 Additional Services Project $NOW also offers specialized services in collaboration with other entities: - Marketing Grants: In partnership with the Radix Community Council, Project $NOW provides marketing grants to selected projects on the Radix Network (https://projectnow.io/rcc/) . This initiative aims to support and promote promising projects within the ecosystem. - Token Trek Services: For projects listed on Token Trek, Project $NOW offers management services, including quest creation and thumbnail design (https://projectnow.io/rcc/) , to help projects maximize their visibility on the platform. $NOW Token The $NOW token is an integral part of the Project $NOW ecosystem, serving multiple purposes within the platform and for the broader Radix community. 4.1 Overview The $NOW token is described as "the best-designed coin on the Radix Ecosystem" (https://projectnow.io/token/) by Project $NOW. 
While this claim is subjective, it highlights the emphasis the project places on the token's design and utility. Key features of the $NOW token include: - Utility: The $NOW token serves as a key to unlocking marketing potential for projects on the Radix Chain (https://projectnow.io/) . - Platform Integration: The token is integrated into the Project $NOW platform, which serves as a central hub for Radix projects, community engagement, and giveaways (https://projectnow.io/) . Token holders may have preferential access or additional benefits within this ecosystem. - Reward Mechanism: The platform implements a "hold-to-win" mechanism, where the more $NOW tokens a user holds, the more they can potentially win (https://projectnow.io/) in platform-related activities or giveaways. 4.2 Token Economics The $NOW token incorporates several economic mechanisms designed to support its value and utility: - Revenue Sharing: Project $NOW commits to reinvesting 10% of the revenue from the $NOW agency back into the coin or allocating it for active marketing campaigns (https://projectnow.io/) . This mechanism aims to create a positive feedback loop between the success of the agency and the value of the token. - Staking: The $NOW token can be staked on DefiPlaza, a decentralized finance platform on the Radix network (https://projectnow.io/token/) . Staking provides additional benefits to token holders: - A distribution of 350,000 NOW tokens is made each week to stakers. - Stakers also receive a share of "Bonus Box Rewards," although the specifics of these rewards are not detailed in the available information. 4.3 Acquisition and Trading The $NOW token is primarily available through the Radix ecosystem: - Purchase Process: Users can buy $NOW tokens on OCISwap, a decentralized exchange on the Radix network (https://projectnow.io/token/) . 
- Wallet Compatibility: The token is compatible with the official Radix wallet, which users need to set up before purchasing $NOW (https://projectnow.io/token/) . 5. Partnerships and Collaborations Project $NOW has established various partnerships and collaborations within the Radix ecosystem, demonstrating its commitment to fostering growth and innovation in the blockchain space. 5.1 Partner Projects Project $NOW's portfolio (https://projectnow.io/services/) showcases a range of partner projects, indicating the diversity of its client base within the Radix ecosystem. Some of the notable partner projects include: - $HUG - NFTwars - $STAB - $DFP - $WEFT - $ASTRL - $WOWO - $PHOENIX - $HIT - $FOMO 5.2 Radix Community Council Collaboration One of the most significant collaborations for Project $NOW is its partnership with the Radix Community Council. This collaboration focuses on providing marketing support to projects within the Radix ecosystem: - Marketing Grants Program: Project $NOW and the Radix Community Council have partnered to offer marketing grants to selected Radix projects (https://projectnow.io/rcc/) . This initiative aims to support and promote promising projects within the Radix ecosystem. - Grant Structure: At the end of each month, the Radix Community Council sponsors marketing grants to four Radix projects (https://projectnow.io/rcc/) . The grants are distributed as follows: - 1 grant for a meme project. - 1 grant for an NFT project. - 2 grants for Radix projects from various categories. - Grant Benefits: Each selected project receives 25 hours of marketing services from Project $NOW (https://projectnow.io/rcc/) . These services can include: - Telegram engagement strategies. - UI/UX design. - Social media campaigns and design. - Token Trek services (management of project listings on the Token Trek platform). 
5.3 Technology Partnerships While not explicitly detailed in the available information, Project $NOW's services and token ecosystem suggest implicit technological partnerships or integrations: - OCISwap: The $NOW token is available for purchase on OCISwap (https://projectnow.io/token/) , indicating a partnership or at least a working relationship with this decentralized exchange on the Radix network. - DefiPlaza: The $NOW token can be staked on DefiPlaza (https://projectnow.io/token/) , suggesting a collaboration or integration with this DeFi platform. Business Model Project $NOW operates with a multifaceted business model that combines traditional agency services with innovative tokenomics and community-driven initiatives. This approach allows them to generate revenue while also fostering growth within the Radix ecosystem. 6.1 Agency Services The primary source of revenue for Project $NOW appears to be its comprehensive suite of marketing and promotional services (https://projectnow.io/services/) tailored for projects on the Radix blockchain. These services include: - Video production. - Social media management. - Branding. - Design. - Giveaway campaigns. - BuyBOT and Twitter alerts. 6.2 Revenue Sharing A unique aspect of Project $NOW's business model is its commitment to reinvesting a portion of its revenue back into its ecosystem: - 10% of the revenue from the $NOW agency is reinvested into the $NOW coin or allocated for active marketing campaigns (https://projectnow.io/) . This approach serves multiple purposes: - It potentially increases the value of the $NOW token, benefiting token holders. - It provides funding for marketing initiatives, which could attract more clients and partners to the ecosystem. - It demonstrates a long-term commitment to the growth of the Project $NOW ecosystem. 
6.3 Token Economy The $NOW token plays a significant role in the company's business model: - Utility: The token serves as a key to unlocking marketing potential for projects on the Radix Chain (https://projectnow.io/) , suggesting that holding or spending $NOW tokens might be required to access certain services or receive preferential treatment. - Staking Rewards: Users can stake $NOW tokens on DefiPlaza to earn additional tokens and other rewards (https://projectnow.io/token/) . This mechanism encourages users to hold $NOW tokens, potentially increasing demand and value. - Platform Integration: The more $NOW tokens a user holds, the more they can potentially win in platform-related activities or giveaways (https://projectnow.io/) . This incentivizes users to acquire and hold $NOW tokens. 6.4 Community Grants While not a direct revenue generator, Project $NOW's collaboration with the Radix Community Council to provide marketing grants (https://projectnow.io/rcc/) is an important part of its business model: - This initiative helps Project $NOW establish itself as a key player in the Radix ecosystem. - It provides exposure to new projects that may become paying clients in the future. - It demonstrates Project $NOW's expertise and capabilities to the broader Radix community. 6.5 Partnerships Project $NOW's partnerships with various projects in the Radix ecosystem (https://projectnow.io/services/) contribute to its business model, potentially through: - Revenue from services provided to partner projects. - Cross-promotion opportunities. - Increased visibility and credibility within the Radix ecosystem. Project $NOW's business model combines traditional agency services with innovative blockchain-based elements, creating a unique value proposition in the cryptocurrency marketing space. By aligning its success with that of its token holders and the broader Radix ecosystem, Project $NOW aims to create a sustainable and growing business within the blockchain industry. 
## Ploughshare URL: https://radix.wiki/ecosystem/ploughshare Updated: 2026-02-06 Summary: Ploughshare, previously known as CowDAO, is a project based on the SRWA protocol aimed at providing alternative finance solutions to farmers in New Zealand amid Ploughshare, previously known as CowDAO, is a project based on the SRWA protocol aimed at providing alternative finance solutions to farmers in New Zealand amidst the decreasing bank support in the AgTech sector. Overview Ploughshare focuses on addressing the finance challenges faced by farmers in New Zealand due to banks' tightening terms and dwindling support in the AgTech space. By leveraging the Radix network, the SRWA protocol, and a network with proven real-world traction, Ploughshare positions itself as a viable alternative for farmers seeking financial support. Ploughshare Demo | www.srwa.io (https://youtu.be/Q4i78_dr1ug?si=W8_0q0F-0zfwFYop) | Nov 2023 Sustainability The project is deeply rooted in principles of sustainability and has plans to transition to a decentralized, DAO governance model. By partnering with subject matter experts, Ploughshare looks into the potential of acquiring orphan technologies and ensuring sustainability, focusing specifically on organic, non-GMO, pasture-based animal agriculture. Lending and Borrowing Initially, Ploughshare aims to introduce a straightforward DeFi style of lending and borrowing. However, it plans to diversify into traditional finance models over time. There is an ongoing exploration of a more extended term, mortgage-based loan offering with fixed interest rates and monthly installments, given its familiarity among most of Ploughshare's clientele. Ploughshare Token ($MOO) The Ploughshare token, represented as $MOO, plays a multifaceted role in the project: - Fundraising: During the token sale.
- Reward System: For contributions to the project and for utilizing specific project features. - Governance: Serving as voting power within the DAO for project decision-making. Detailed information about the Ploughshare Tokenomics remains in the development phase. Regular updates can be found in the Tokenomics section. Early-stage Funding Ploughshare has already secured its early-stage funding, which is expected to catalyze its growth via investment rounds and the sale of Ploughshare tokens. Token Supply The tokenomics of Ploughshare remains a work in progress. Preliminary models suggest a fixed supply approach, but this may undergo changes leading up to the private sale investment round. ## Parabox URL: https://radix.wiki/ecosystem/parabox Updated: 2026-02-06 Summary: Parabox was the first metaverse built on the Radix Ledger but has since been disbanded. Developed by Parabox Studios, the team were on a quest to deliver the id Parabox was the first metaverse built on the Radix Ledger but has since been disbanded. Developed by Parabox Studios, the team were on a quest to deliver the ideal metaverse experience (https://www.esports.net/news/paradox-metaverse-announced/) . https://youtu.be/Mcq1bjEZOgc?si=6OitQ7jfzODV0aJL (https://youtu.be/Mcq1bjEZOgc?si=6OitQ7jfzODV0aJL) Overview Parabox featured a traversable virtual world that was free to play for everyone. Users could leverage their digital assets in a 3D environment, fostering an ecosystem of gamified experiences. To begin the Parabox journey, 10,000 unique Genesis Avatars were minted into existence. These Genesis Avatars were the first playable NFT avatars of Parabox. Vision Parabox aimed to be the first self-sustaining Web3 virtual world ecosystem owned, operated, and built upon by its community. The infrastructure of Parabox was powered by Distributed Ledger Technology (DLT), which allowed users to have genuine autonomy over their digital assets and experiences.
The virtual world of Parabox aimed to avoid the threat of a centralized entity or single point of failure that could hinder the user experience. Genesis Avatars Parabox minted 10,000 unique Genesis Avatars into existence at the beginning of its journey. Commonly abbreviated as GAT, these were unique 3D voxel avatars existing within a Smart Contract of 10,000 uniquely identifiable DLT-backed tokens called NFTs. They were the first 10,000 playable Avatar NFTs on the Radix Ledger and were playable within all the experiences of the original Parabox virtual world application. To reserve a Genesis Avatar, 300 $XRD (Radix's cryptocurrency) had to be sent to a specified address. In the pre-Babylon phase, the Parabox community reserved GAT in exchange for fungible placeholder tokens called GENAV. Post-Babylon, the GENAV tokens were redeemable for GAT NFTs. The distribution ratio of GAT was 80% to the Parabox community, which initially purchased GAT in a public sale, and 20% to developers of the Parabox ecosystem. $PARA $PARA was a gamified currency token in Parabox's in-world economy, allowing the Parabox virtual society to exchange what they thought was of equal value. It served as a unified DAO treasury token across multiple Parabox DAOs and could be used for trading, gamified prize pooling, art galleries, events, and more. The initial distribution ratio of PARA was: 80% to initial GAT holders, 0.5% to Ociswap DEX with 50k in $XRD as liquidity, and just under 20% to developers of the Parabox ecosystem. ## Nest Finance URL: https://radix.wiki/ecosystem/nest-finance Updated: 2026-02-06 Summary: Nest Finance is a first-of-its-kind DeFi protocol that unifies Lending, Liquidity, and Leverage into a single, secure DeFi product suite. On Nest Finance, users Nest Finance is a first-of-its-kind DeFi protocol that unifies Lending, Liquidity, and Leverage into a single, secure DeFi product suite.
On Nest Finance, users can: - Borrow and lend their assets. - Provide leveraged liquidity to concentrated liquidity DEXs. - Build their own automated liquidity strategies. - Use concentrated liquidity positions as collateral. Overview Nest Finance offers a suite of products that combine a variety of DeFi primitives to power sophisticated strategies - each wrapped into an accessible, user-centric interface. Nest Finance was originally created to offer users the easiest possible way of providing liquidity and earning yield on-chain, built around the protocol's one-click, auto-compounding concentrated liquidity vaults. The Nest Finance product suite is packaged into an industry-leading UX that offers transparent analytics, detailed performance data, and extensive position info. Our Products - Borrow & Lend: Earn Interest & Borrow. - Multiply Vaults: Boosted XRD Yield. - Long/Short Vaults: Increase Your Exposure. - Liquidity Vaults: Automated LP Strategies. - DIY Vault Creator: Custom LP Strategies. ## Nerds Republic URL: https://radix.wiki/ecosystem/nerds-republic Updated: 2026-02-06 Summary: Nerds Republic is an NFT project launched by the team behind Foton. The collection consists of 10,000 unique characters. Nerds Republic is an NFT project launched by the team behind Foton. The collection consists of 10,000 unique characters. Launched as the first NFT initiative on Radix, each Nerd in the collection represents a unique identity, granting its owner full access within the Republic. The project aims to create a virtual space for the exploration of habitable planets and the research of alien life forms. https://youtu.be/QsKNRHfkX7A?si=rOU1Y19zWoKQxDcN (https://youtu.be/QsKNRHfkX7A?si=rOU1Y19zWoKQxDcN) Concept Set in a fictional universe, the base for these digital characters is the 'TinCan', a space station orbiting Earth.
The overarching goal of the Republic is to search for habitable planets and study extraterrestrial life, emphasizing the need for skilled and unique individuals. Features - Artwork as Token: The NFTs are implemented as SVG files, ensuring that the artwork is compact and directly stored on the Radix Ledger. This feature guarantees that the artwork is inseparable from its proof of authenticity and allows for interactive elements or animations. - High-Resolution Graphics: Regardless of the display size, the artworks retain their sharpness and accuracy, thanks to vector graphics. - Commercial Rights: Owners have the liberty to use their Nerd artwork commercially, such as for merchandise. - Efficient Transactions: Hosted on the Radix Ledger, these NFTs benefit from low transaction fees and high energy efficiency. Tokenomics Each Nerd initially comes with a 10k allowance of Foton tokens, the Republic's currency. These tokens can be used for various activities within the Republic, like trading, supporting research, or even teaching newcomers about cryptocurrency. The total supply of Foton tokens is set at 1 Billion. Reservation and Distribution Starting from August 7, 2021, Nerds could be reserved by sending $XRD to the Republic's wallet. Each Nerd was priced at 150 $XRD. The project employed a system of Genesis Nerd Tokens ($GNRD), which were later swappable for the final NFTs upon the launch of smart contracts on Radix. Roadmap The Nerds Republic has a dynamic roadmap, with milestones based on the percentage of Nerds reserved. Achievements include the introduction of alien species, the opening of a Mars base, and the eventual goal of transforming the Republic into a Metaverse game. Further ambitions involve the creation of a platform for NFTs on Radix and even the production of a movie. ## Mox Studio URL: https://radix.wiki/ecosystem/mox-studio Updated: 2026-02-06 Summary: - Designer & Product Manager. - Senior Software Test Engineer. - App Support & Test Specialist. 
- Web Developer. 💰 Assets:: https://ociswap.com/resource_rdx1thmjcqjnlfm56v7k5g2szfrc44jn22x8tjh7xyczjpswmsnasjl5l9 (https://ociswap.com/resource_rdx1thmjcqjnlfm56v7k5g2szfrc44jn22x8tjh7xyczjpswmsnasjl5l9) 👋 Open Positions:: Scrypto (Rust) Developer (https://www.notion.so/Scrypto-Rust-Developer-16bdfc2a7e3d807b9cd0c99a594899f0?pvs=21), Senior Godot Developer (https://www.notion.so/Senior-Godot-Developer-16bdfc2a7e3d8051a2d0f28513667f5a?pvs=21), 3D Artist (https://www.notion.so/3D-Artist-16bdfc2a7e3d8058bb65dea8121f36fd?pvs=21) Status: 🟡 Mox Studio (https://moxstudio.net/) is a gaming company specializing in Play-to-Earn (P2E) mobile and web-based games built on the Radix distributed ledger technology (DLT) platform. Founded in 2023, the company develops games that integrate cryptocurrency rewards through its native MOX token (https://moxstudio.net/token/), allowing players to earn digital assets while playing. https://youtu.be/d8IPOdP7SoM (https://youtu.be/d8IPOdP7SoM) Overview Mox Studio emerged from its founders' background in digital product management, marketing, and IT, with over 15 years of corporate experience (https://moxstudio.net/about/). Having first entered the blockchain industry in 2017, the team moved through various phases of crypto industry involvement before establishing Mox Studio.
The company's primary focus is on creating engaging gaming experiences that showcase Radix DLT capabilities to a broader audience. Mox Studio's business model combines traditional gaming elements with blockchain technology, offering players the opportunity to earn cryptocurrency through gameplay. Their games are distributed globally through major platforms (https://moxstudio.net/) including the App Store and Google Play Store, making blockchain gaming accessible to mainstream audiences. The studio distinguishes itself through its integration with the Radix ecosystem (https://moxstudio.net/about/) , featuring mascots and tokens from partner projects within their games. This approach not only enhances the gaming experience but also promotes wider adoption of the Radix platform. Their flagship game, Strangers (https://moxstudio.net/games/strangers/) , exemplifies this integration, featuring both traditional gaming elements and blockchain-based rewards. Headquartered in Dalaman, Muğla (https://www.linkedin.com/company/moxstudio/) , Turkey, the company operates with a team of 2-10 employees, focusing on developing innovative gaming experiences that bridge the gap between traditional gaming and blockchain technology. Their mission emphasizes the creation of high-quality games that are both entertaining and financially rewarding for players. The studio's development approach prioritizes player engagement and technological innovation, demonstrated through their diverse portfolio of games including WOWO:Adventure, OciGravity, and Ociswap Hooks. Each game incorporates unique gameplay mechanics while maintaining the core principle of Play-to-Earn, allowing players to participate in the gaming economy through the MOX token system. History The development of Mox Studio evolved through several distinct phases, beginning well before its official establishment as a company. 
Early Blockchain Period (2017) The company's journey began in 2017 (https://moxstudio.net/about/) when its founding team first entered the blockchain industry. This initial exploration was supported by over 15 years of prior experience in digital product management, marketing, and information technology sectors. During this period, the team focused on understanding blockchain technology and its potential applications in digital products. Expansion into Crypto Services (2018-2020) In 2018, the team began providing design consultancy services (https://moxstudio.net/about/) to various business associates within the cryptocurrency industry. Their work during this period focused on helping projects elevate their brands and establish stronger market presence. By 2019, they had expanded their service offerings to include animation and motion design, building a comprehensive portfolio of crypto-related digital services. Gaming Integration (2021-2022) A significant pivot occurred in 2021 (https://moxstudio.net/about/) when the team recognized the transformative potential of gaming within the crypto sector. This period marked a crucial transition as they began integrating their accumulated design expertise and development capabilities to create gaming experiences that could engage broader audiences. The focus shifted specifically toward developing games that could showcase blockchain technology's potential in interactive entertainment. Company Formation (2023) Mox Studio was officially established in 2023 (https://www.linkedin.com/company/moxstudio/) , marking the culmination of years of experience in blockchain, design, and gaming. The company was founded with a clear mission to develop Play-to-Earn (P2E) games that would integrate seamlessly with the Radix DLT ecosystem. This period saw the launch of their first major projects, including their flagship game Strangers. 
Recent Developments (2023-Present) Since its formal establishment, Mox Studio has continued to expand its game portfolio (https://moxstudio.net/games/) and strengthen its position within the Radix ecosystem. Notable developments include: - The launch of multiple games including WOWO: Adventure, OciGravity, and Ociswap Hooks. - The introduction of the MOX token (https://moxstudio.net/token/) as a central element of their gaming ecosystem. - Expansion into Telegram-based gaming with titles like HUG Monsters and Radix Fusion (https://www.linkedin.com/company/moxstudio/). - Development of partnerships within the Radix ecosystem to integrate various mascots and tokens into their games. The company maintains its headquarters in Dalaman, Muğla, Turkey, operating with a focused team of professionals dedicated to advancing their vision of blockchain-integrated gaming experiences. Their ongoing development efforts continue to push boundaries in combining traditional gaming elements with blockchain technology, setting new standards for Play-to-Earn gaming experiences. Games Portfolio Strangers Strangers (https://moxstudio.net/games/strangers/) serves as Mox Studio's flagship game, offering a survival-based gaming experience where players find themselves stranded on an alien planet following a spacecraft malfunction. The game features: - Support for 13 languages including English, French, Chinese, Japanese, Korean, Russian, Polish, Spanish, Portuguese, German, Turkish, and Italian. - A roster of 44 unique characters with distinct abilities. - An arsenal of weapons allowing players to equip up to 6 simultaneously. - 177 different items for skill enhancement and gameplay customization. - Wave-based combat system with challenges lasting 20-90 seconds. - Four-tier upgrade system for character progression. - Integration with the MOX token rewards system.
WOWO: Adventure WOWO: Adventure (https://moxstudio.net/games/wowo-adventure/) is a web-based pixel art platformer that follows the journey of WOWO, a character attempting to save their cursed village. The game represents a collaboration with the WOWO project, which aims to promote Radix as a mainstream blockchain platform. Players progress through levels while activating various skills, combining nostalgic gameplay elements with modern blockchain integration. OciGravity OciGravity (https://moxstudio.net/games/ocigravity/) features Ocicat, the mascot of the Ociswap decentralized exchange (DEX) on Radix DLT. This pixel art platform game incorporates unique gravitational mechanics and puzzle elements. Players use one-button controls to navigate through levels while competing for positions on a global leaderboard. Ociswap Hooks Ociswap Hooks (https://moxstudio.net/games/ociswap-hooks/) is a pixel art puzzle platformer featuring the Ocicat character. The game emphasizes hook-based navigation mechanics and includes competitive leaderboard features. It serves as another collaboration with the Ociswap DEX platform. Recent Additions Recent additions to the game portfolio include Telegram-based games (https://www.linkedin.com/company/moxstudio/) such as: - HUG Monsters: A bouncing-style game incorporating $HUG and $MOX token rewards. - Radix Fusion: A puzzle game where players combine cryptocurrency tokens based on market capitalization. - Mings: A multiplayer game supporting up to 16 players simultaneously, inspired by Flappy Bird and Mario Kart. MOX Token The MOX token (https://moxstudio.net/token/) serves as the primary cryptocurrency within Mox Studio's gaming ecosystem. Key features include: Token Utility - Used for in-game purchases. - Unlocking characters and special features. - Accessing game customization options. - Reward distribution for gameplay achievements.
Token Distribution The total supply of MOX tokens is fixed at 100,000,000, allocated as follows: - 35% for Marketing, Team, and Development. - 20% for P2E Rewards. - 20% for Staking Rewards. - 15% for Presale and Liquidity. - 10% for Founders. Token Integration MOX tokens can be earned through gameplay (https://moxstudio.net/token/) or purchased on supported cryptocurrency exchanges. The token system is designed to integrate with the Radix DLT platform, though some features await completion of Radix's deep linking development. Development Services Mox Studio offers game development services (https://moxstudio.net/service/) structured around four key stages: - Conceptualization: Creating unique game concepts aligned with project vision. - Design: Developing detailed characters, environments, and game elements. - Development: Implementation using advanced technologies for mobile and web platforms. - Testing and Launch: Quality assurance and post-launch support. The company emphasizes customer-centric creativity and maintains a focus on fast, high-quality solutions while leveraging their expertise in both the gaming and crypto industries. Their development approach prioritizes end-to-end solutions and collaborative partnerships with clients. Technology Radix Integration Mox Studio's technology stack is built primarily around the Radix Distributed Ledger Technology (DLT) (https://moxstudio.net/) platform. The integration of Radix DLT serves as the foundation for their Play-to-Earn (P2E) ecosystem, enabling cryptocurrency transactions and reward distributions within their games. This technological choice aligns with their mission to showcase Radix DLT capabilities to a broader gaming audience. Gaming Platforms The company develops games for multiple platforms, implementing a cross-platform strategy that includes: Mobile Development Mobile applications (https://moxstudio.net/) are distributed through major platforms: - iOS applications through the Apple App Store. 
- Android applications through Google Play Store. Web-Based Gaming Web-based games (https://moxstudio.net/games/) are developed to be accessible through standard web browsers, requiring no additional software installation. The studio utilizes modern web technologies to ensure consistent performance and accessibility across different devices and platforms. Telegram Integration Recent technological expansions include the development of Telegram-based gaming platforms (https://www.linkedin.com/company/moxstudio/), enabling multiplayer functionality and social integration. These implementations support features such as: - Simultaneous multiplayer gaming for up to 16 players. - Real-time leaderboard updates. - Integrated chat functionality. - Token reward distribution systems. P2E Implementation The Play-to-Earn mechanics (https://moxstudio.net/token/) are implemented through a sophisticated system that integrates several technological components: Token Integration The MOX token system is integrated into the gaming platform through: - Smart contract implementation for token distribution. - Automated reward systems based on player achievements. - Integration with various cryptocurrency exchanges. - Staking mechanisms through platforms like DefiPlaza. Wallet Integration Games are designed to integrate seamlessly with the Radix wallet (https://www.linkedin.com/company/moxstudio/), facilitating: - Direct token transfers. - In-game purchases. - Reward collection. - Asset management. Development Infrastructure The studio's development process (https://moxstudio.net/service/) utilizes a comprehensive technological framework that includes: Game Development Tools - Advanced technologies for both mobile and web platform development. - Pixel art development tools for retro-style games. - Physics engines for games like OciGravity. - Multiplayer server infrastructure for social gaming experiences.
Quality Assurance Systems The company employs rigorous testing protocols including: - Performance optimization tools. - Server load testing systems. - Cross-platform compatibility testing. - Network stability monitoring. Security Implementation Security measures are implemented across various aspects of the platform: - Blockchain transaction security. - User data protection. - Anti-cheat systems for competitive gaming. - Secure reward distribution mechanisms. Future Development The company continues to expand its technological capabilities (https://moxstudio.net/about/) , with ongoing development in areas such as: - Deep linking integration with Radix. - Enhanced multiplayer functionality. - Improved cross-platform compatibility. - Advanced token utility implementations. The technology stack is continuously updated to maintain compatibility with the latest developments in both gaming and blockchain technologies, ensuring the platform remains current and competitive in the rapidly evolving blockchain gaming market. Business Model Play-to-Earn Framework Mox Studio's primary business model centers around the integration of Play-to-Earn (P2E) mechanics within their gaming ecosystem. The company has developed (https://moxstudio.net/) a system where players can earn MOX tokens through gameplay achievements and activities. This approach creates a dual-value proposition where users both enjoy entertainment through gaming and potentially earn cryptocurrency rewards. The P2E framework is implemented through various mechanisms across their game portfolio. In their flagship game Strangers (https://moxstudio.net/games/strangers/) , players can earn MOX tokens through regular gameplay, which can then be used to unlock additional characters, access special features, or customize their gaming experience. This creates a circular economy within the game ecosystem, encouraging sustained player engagement and investment in the platform. 
Revenue Streams The company generates revenue through multiple channels. Their games incorporate (https://moxstudio.net/token/) various monetization strategies, including in-game purchases using MOX tokens. The token system serves as both a reward mechanism for players and a revenue generator for the company, with tokens being available for purchase on supported cryptocurrency exchanges. Additionally, Mox Studio offers comprehensive game development services (https://moxstudio.net/service/) to clients, providing end-to-end solutions from conceptualization through launch and ongoing support. This service-based revenue stream complements their direct gaming revenue, creating a diversified business model that supports sustained growth and development. Distribution Strategy The company employs a multi-platform distribution strategy to maximize reach and accessibility. Their games are distributed globally (https://moxstudio.net/) through major platforms including the Apple App Store and Google Play Store, ensuring broad market penetration across different geographical regions and user demographics. This wide distribution approach helps in building a diverse user base and creating multiple entry points for new players into their ecosystem. Recent expansions into Telegram-based gaming (https://www.linkedin.com/company/moxstudio/) have further diversified their distribution channels, allowing them to tap into existing social media communities and create new engagement opportunities. Games like HUG Monsters and Radix Fusion demonstrate their ability to adapt their business model to different platforms while maintaining their core P2E mechanics. Ecosystem Integration A key aspect of Mox Studio's business model is their deep integration with the Radix DLT ecosystem. The company actively collaborates (https://moxstudio.net/about/) with other projects within the Radix ecosystem, featuring their mascots and tokens within Mox Studio games.
This strategy serves multiple purposes: it enhances the gaming experience, promotes wider adoption of the Radix platform, and creates mutual value through cross-project collaboration. Their partnership approach extends beyond simple integration, as demonstrated by games like OciGravity and Ociswap Hooks, which feature the mascot of the Ociswap decentralized exchange. These collaborations create a network effect within the Radix ecosystem, potentially driving user adoption across multiple platforms and services. Sustainability Model The company's approach to business sustainability is reflected in their token distribution model. The MOX token allocation (https://moxstudio.net/token/) includes specific portions for marketing, team development, P2E rewards, and staking rewards, creating a balanced economic system that supports long-term growth. The allocation of 20% of tokens for P2E rewards and another 20% for staking rewards demonstrates a commitment to maintaining player engagement while ensuring economic sustainability. Market Position As a relatively new entrant in the blockchain gaming space, having launched in 2023 (https://www.linkedin.com/company/moxstudio/) , Mox Studio has positioned itself as a specialized developer focused on the Radix DLT ecosystem. Their business model reflects this specialization, with a clear focus on creating high-quality games that showcase the capabilities of blockchain technology while maintaining traditional gaming appeal. This positioning allows them to serve both conventional gamers and cryptocurrency enthusiasts, potentially expanding their market reach beyond traditional gaming audiences. Team Leadership and Development Mox Studio operates with a core team of professionals (https://moxstudio.net/) specializing in various aspects of game development, design, and technical operations. 
The company maintains a relatively small but focused team structure, operating with 2-10 employees (https://www.linkedin.com/company/moxstudio/) as of 2024. Key Personnel The studio's development and operations are led by Baris Tasmaz (https://moxstudio.net/about/) , who serves as the Game Designer & Developer. Tasmaz brings extensive experience in game development and design, playing a crucial role in shaping the studio's technical direction and creative vision. Ayse Tasmaz (https://moxstudio.net/about/) holds the position of Designer & Product Manager, overseeing the studio's product development strategy and design implementations. Her role encompasses managing the overall product lifecycle and ensuring design cohesion across the studio's various projects. The technical testing and quality assurance department is headed by Eyup Akcelik (https://moxstudio.net/about/) , who serves as the Senior Software Test Engineer. Akcelik's responsibilities include ensuring the reliability and performance of the studio's games across various platforms. Mehmet Topalak (https://moxstudio.net/about/) functions as the App Support & Test Specialist, focusing on maintaining application quality and providing technical support for the studio's various gaming products. His role is crucial in ensuring smooth operation of the games and addressing technical issues that may arise. The web development aspects of the studio are handled by Gencay Kocak (https://moxstudio.net/about/) , who serves as the Web Developer. Kocak's work is essential in maintaining and developing the studio's web-based gaming platforms and online presence. Corporate Structure The company operates from its headquarters in Dalaman, Muğla, Turkey (https://www.linkedin.com/company/moxstudio/) , maintaining a flexible work environment that accommodates both local and remote team members. This structure allows the studio to access talent globally while maintaining operational efficiency. 
Team Culture The team's approach is guided by several core principles (https://moxstudio.net/about/) that shape their work culture. These include a commitment to integrity in all interactions, a passion for bringing original ideas to life, and a focus on excellence in game development and customer support. The team emphasizes innovation and community building, fostering an environment that encourages creative thinking and collaborative problem-solving. The studio's personnel structure reflects its evolution from its founding team's extensive experience in digital product management, marketing, and IT. This background has influenced the team's approach to combining traditional gaming elements with blockchain technology, creating a unique blend of technical expertise and creative game development capabilities. ## Minnie URL: https://radix.wiki/ecosystem/minnie Updated: 2026-02-06 Summary: About Minnie 🐶 Dan Hughes, the founder of Radix, owns a fluffy white Bichon called Minnie. Minnie is super cute and runs almost as fast as swaps on the Cassandra testnet. As a Radix enthusiast to the core, Minnie is occasionally allowed to follow Dan to the office - to where the magic happens. How to buy $MNI - Ociswap - ociswap.com (https://ociswap.com/) - Caviarnine - https://www.caviarnine.com/trade (https://www.caviarnine.com/trade) - DefiPlaza - https://radix.defiplaza.net/swap (https://radix.defiplaza.net/swap) ## Maya Protocol URL: https://radix.wiki/ecosystem/maya-protocol Updated: 2026-02-06 Summary: Maya Protocol is a decentralized cryptocurrency exchange ecosystem that enables cross-chain asset trading without relying on wrapped tokens or traditional bridge infrastructure. Maya Protocol is a decentralized cryptocurrency exchange ecosystem (https://docs.mayaprotocol.com/introduction/readme/getting-started) that enables cross-chain asset trading without relying on wrapped tokens or traditional bridge infrastructure.
Launched in 2024, the protocol consists of two main components: MAYAChain, which is currently operational, and AZTECChain, which is under development. https://youtu.be/f8k2tx2NFi4 (https://youtu.be/f8k2tx2NFi4) Overview MAYAChain functions as an Automated Market Maker (AMM) (https://docs.mayaprotocol.com/) similar to Uniswap, but with the distinctive capability to utilize cross-chain liquidity. This approach differs from conventional cross-chain exchanges that typically rely on wrapping assets and bridging, methods that have proven vulnerable to security breaches. According to Chainalysis data cited in the protocol's documentation, bridge hacks accounted for $2 billion in losses in 2022, representing 69% of all DeFi hacks. The protocol is a friendly fork of THORChain and employs a unique security model that manages funds directly in on-chain vaults, protected through economic security mechanisms. This is achieved through the implementation of three core technologies: the Tendermint consensus engine, Cosmos-SDK state machine, and GG20 Threshold Signature Scheme (TSS). As of September 2024, Maya Protocol has successfully integrated with multiple major blockchain networks (https://www.radixdlt.com/blog/maya-protocol-integration-is-now-complete) , including Ethereum, Arbitrum, and most recently, the Radix network. The protocol averages approximately $4 million in daily trading volume, providing a significant gateway for users to move assets between different blockchain ecosystems. Technical Foundation Maya Protocol's architecture is built on the principle of trustless cross-chain trading. Instead of using wrapped tokens or traditional bridges (https://docs.mayaprotocol.com/introduction/readme/getting-started) , the protocol implements a sophisticated system of continuous liquidity pools that enable direct asset exchanges across different blockchains. 
This approach minimizes counterparty risk and reduces the potential points of failure often associated with cross-chain bridges. The protocol utilizes a three-token system to facilitate its operations: - $CACAO serves as the primary token (https://docs.mayaprotocol.com/introduction/readme/cacao) of the ecosystem, used for transaction fees and as the settlement asset in liquidity pools. All $CACAO was allocated through a fair liquidity auction, with no team allocation. - MAYA functions as a revenue share token (https://docs.mayaprotocol.com/introduction/readme/cacao) , with holders receiving 10% of all swap and transaction fees on MAYAChain in the form of daily $CACAO rewards. - AZTEC is planned as a revenue share token (https://docs.mayaprotocol.com/introduction/readme/cacao) for the upcoming AZTECChain, designed to distribute 10% of all transaction fees to token holders. Recent Developments A significant milestone in Maya Protocol's evolution was the implementation of Streaming Swaps (https://www.radixdlt.com/blog/maya-protocol-integration-is-now-complete) , a feature that improves price execution by breaking large trades into smaller components. This mechanism operates similarly to Time Weighted Average Price (TWAP) trades but with a 24-hour time limit, allowing for better price discovery and reduced slippage. The protocol has also introduced innovative features such as Impermanent Loss Protection (https://docs.mayaprotocol.com/deep-dive/how-it-works/impermanent-loss-protection-ilp) for liquidity providers and a sophisticated economic model (https://docs.mayaprotocol.com/deep-dive/how-it-works/dynamic-inflation) that includes dynamic inflation mechanisms to maintain system stability. Maya Protocol represents a significant advancement in decentralized cross-chain trading infrastructure, offering solutions to many of the security and efficiency challenges that have historically plagued cross-chain asset transfers in the cryptocurrency ecosystem. 
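The intuition behind Streaming Swaps can be shown with a short sketch. Under MAYAChain's slip-based fee model (detailed in the Architecture section), the fee a trade pays grows with its size relative to pool depth, so splitting one large swap into many small sub-swaps yields more output overall. Everything below is illustrative; the function names and pool numbers are invented for this example, not taken from Maya's implementation:

```python
def clp_output(x: float, X: float, Y: float) -> float:
    """Slip-based CLP output: y = x * Y * X / (x + X) ** 2."""
    return (x * Y * X) / (x + X) ** 2

def streaming_swap(total_in: float, X: float, Y: float, parts: int) -> float:
    """Execute a swap as `parts` equal sub-swaps, updating pool depths after each."""
    received = 0.0
    chunk = total_in / parts
    for _ in range(parts):
        out = clp_output(chunk, X, Y)
        X += chunk   # pool gains the input asset
        Y -= out     # and pays out the output asset
        received += out
    return received

# An example pool holding 1,000 input-side units and 2,000 output-side units.
one_shot = streaming_swap(500.0, 1_000.0, 2_000.0, parts=1)
streamed = streaming_swap(500.0, 1_000.0, 2_000.0, parts=50)

# Each small slice pays a smaller slip fee, so the streamed total is higher.
assert streamed > one_shot
```

The 24-hour limit mentioned above bounds how long such a schedule of sub-swaps may run.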
Architecture Maya Protocol's technical architecture is built on several interconnected components that enable secure cross-chain asset trading. At its core, the protocol utilizes three fundamental technologies: the Tendermint consensus engine, Cosmos-SDK state machine, and GG20 Threshold Signature Scheme (TSS). Bifröst Protocol The Bifröst Protocol serves as the foundation for cross-chain communication (https://docs.mayaprotocol.com/deep-dive/how-it-works/technology) , implementing a system of one-way state pegs. Each node operates a "Bifröst" service that manages chain-specific transactions and converts them into standardized witness transactions for the MAYAChain network. This process ensures that transactions maintain consistent parameters across different blockchain architectures, whether they originate from UTXO-based chains like Bitcoin or account-based chains like Ethereum. State Machine The MAYAChain state machine (https://docs.mayaprotocol.com/deep-dive/how-it-works/technology) coordinates asset exchange logic and manages outgoing transactions. It processes finalized transactions through several stages: - Transaction ordering - State change computation - Delegation to appropriate outbound vaults - Creation and storage of transaction output items in the Key-Value store Threshold Signature Scheme (TSS) The TSS implementation (https://docs.mayaprotocol.com/deep-dive/how-it-works/eli5/what-is-threshold-signature-scheme-tss) represents a significant advancement over traditional multisig approaches. Rather than requiring multiple signatures on a single transaction, TSS enables nodes to collectively forge a vault's lock through a modular process where each node shapes a part of the lock corresponding to its key. 
This approach offers several advantages: - Lower transaction costs - Blockchain-agnostic implementation - Enhanced privacy as TSS transactions are indistinguishable from standard transactions Vault System Maya Protocol employs two types of vaults (https://docs.mayaprotocol.com/deep-dive/how-it-works/technology) to manage assets: - Asgard TSS Vaults: These serve as primary inbound vaults, operated by large committees (27-of-40 nodes). - Yggdrasil Vaults: Operating as secondary vaults, these handle outgoing transactions and are managed by individual nodes. The system implements a sharding mechanism for Asgard vaults when the network exceeds 40 nodes, allowing for horizontal scaling while maintaining security. Continuous Liquidity Pools The protocol's Continuous Liquidity Pools (CLP) (https://docs.mayaprotocol.com/deep-dive/mayachain-finance/continuous-liquidity-pools) represent a sophisticated evolution of the traditional automated market maker model. The CLP implements a slip-based fee model that responds dynamically to liquidity demand, calculated using the formula: y = (x · Y · X) / (x + X)² Where: - x represents the input amount - X represents the input balance - Y represents the output balance - y represents the output amount This model offers several key benefits: - Always-available liquidity for all supported assets - Transparent, fair pricing without centralized intermediaries - On-chain price feeds for internal and external use - Democratized arbitrage opportunities - Fee convergence to zero as demand subsides Security Features The protocol implements multiple layers of security measures to protect assets and maintain network stability: Confirmation Counting To guard against double-spend attacks and chain reorganizations (https://docs.mayaprotocol.com/deep-dive/how-it-works/security) , the protocol employs dynamic confirmation requirements based on transaction value. This is particularly important for chains without instant finality.
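The slip-based CLP formula described under Continuous Liquidity Pools lends itself to a short numeric sketch. The following Python is illustrative only — the function names are invented for this example and are not part of MAYAChain:

```python
def clp_swap_output(x: float, X: float, Y: float) -> float:
    """Output amount under the documented slip-based CLP formula:
    y = (x * Y * X) / (x + X)^2."""
    return (x * Y * X) / (x + X) ** 2

def clp_slip(x: float, X: float) -> float:
    """Slip fraction x / (x + X): the fee converges to zero as the
    trade size x becomes small relative to the pool depth X."""
    return x / (x + X)

# A trade equal to the pool depth incurs 50% slip: the "ideal"
# constant-price output would be 50, but the CLP pays out 25.
print(clp_swap_output(100, 100, 100))  # 25.0
print(clp_slip(100, 100))              # 0.5
```

Note how the formula factors as y = (x / (x + X)) · Y · (X / (x + X)) — the ideal output scaled down by the slip — which is what makes the fee demand-sensitive.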
Solvency Verification The protocol includes an automatic solvency checker (https://docs.mayaprotocol.com/deep-dive/how-it-works/security) that operates in two modes: - Reactive: Continuously compares expected vault balances against actual chain wallet amounts - Proactive: Validates transaction solvency before signing to prevent potential insolvency Transaction Throttling To prevent rapid asset drainage (https://docs.mayaprotocol.com/deep-dive/how-it-works/security) , the protocol implements outbound transaction throttling with a maximum value limit per block (currently 1000 $CACAO worth). Large outbound transactions are automatically spread across multiple blocks, up to 720 blocks (approximately one hour), providing time for security measures to engage if necessary. Governance Mechanisms The protocol employs a minimalist governance approach (https://docs.mayaprotocol.com/deep-dive/how-it-works/governance) through the Mimir system, which allows for adjustments to network parameters without requiring direct node communication. This system manages: - Asset listing and delisting - Chain integration processes - Protocol upgrades - Economic parameters - Emergency response procedures The governance system is deliberately limited to prevent nodes from forming coalitions that could compromise network security, while still maintaining necessary protocol flexibility and upgrade capabilities. This technical infrastructure has enabled Maya Protocol to successfully integrate with multiple major blockchain networks, including its recent integration with the Radix network (https://www.radixdlt.com/blog/maya-protocol-integration-is-now-complete) , demonstrating the system's adaptability and robust cross-chain capabilities. Native Tokens Maya Protocol operates using a three-token system, each serving distinct purposes within the ecosystem. The protocol's token architecture is designed to align incentives between different stakeholders while maintaining economic stability. 
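The outbound throttling rule described under Security Features — a per-block cap of 1000 $CACAO worth, spread over at most 720 blocks — can be sketched as follows. The constants come from the text above; the function itself is a simplified illustration, not the protocol's implementation:

```python
import math

MAX_PER_BLOCK = 1_000  # documented per-block outbound limit ($CACAO worth)
MAX_BLOCKS = 720       # documented upper bound, roughly one hour of blocks

def throttle_schedule(total_value: float) -> list[float]:
    """Spread an outbound transaction evenly across enough blocks to
    respect the per-block limit, never exceeding MAX_BLOCKS."""
    blocks = min(max(1, math.ceil(total_value / MAX_PER_BLOCK)), MAX_BLOCKS)
    return [total_value / blocks] * blocks

print(len(throttle_schedule(500)))         # 1   -- under the limit
print(len(throttle_schedule(5_000)))       # 5   -- five blocks of 1000
print(len(throttle_schedule(10_000_000)))  # 720 -- capped at ~one hour
```

The cap at 720 blocks is what buys time for the other security measures (halts, solvency checks) to engage before a drain can complete.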
$CACAO $CACAO serves as the primary token of the Maya Protocol ecosystem (https://docs.mayaprotocol.com/introduction/readme/cacao) , with a total supply of 100 million tokens. The token's distribution was conducted through a liquidity auction that allocated 90% of the total supply, with the remaining 10% designated for the Impermanent Loss Protection treasury. Notably, the development team received no direct allocation of $CACAO tokens. $CACAO serves three essential roles in the protocol: - Liquidity Function: $CACAO acts as a settlement asset (https://docs.mayaprotocol.com/introduction/readme/cacao) in the protocol's liquidity pools, where it is paired equally (50:50) with external assets. This mechanism allows the protocol to maintain awareness of asset values it secures through arbitrage-driven price discovery. For every $1 million in multi-chain assets pooled, an equivalent amount of $CACAO is required in the pools. - Security Mechanism: The token provides sybil resistance (https://docs.mayaprotocol.com/introduction/readme/cacao) through a Proof of Bond system, where nodes must commit $CACAO as collateral. This bond serves both to identify nodes and to underwrite assets in the pools. If a node attempts to steal assets, their bond is deducted at 1.5 times the stolen amount to make the affected pools whole. - Economic Incentives: $CACAO facilitates the protocol's economic operations (https://docs.mayaprotocol.com/introduction/readme/cacao) through: - Transaction fee payment - Gas cost subsidization - Reward distribution to network participants $CACAO's value structure (https://docs.mayaprotocol.com/introduction/readme/cacao) includes both a deterministic component and a speculative element. The protocol's 1:1 pool ratio mechanism ensures that $CACAO's market capitalization maintains a minimum value equal to the non-$CACAO asset Total Value Locked (TVL) in the protocol. 
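The Proof of Bond arithmetic described above is simple enough to state directly. This sketch uses invented function names; the 1.5× penalty factor and the 1:1 market-cap floor are the documented values:

```python
def slash_bond(bond: float, stolen: float) -> tuple[float, float]:
    """A thieving node's bond is deducted at 1.5x the stolen amount,
    and the deduction is used to make the affected pools whole."""
    penalty = min(1.5 * stolen, bond)  # cannot deduct more than the bond
    return bond - penalty, penalty

def cacao_market_cap_floor(external_tvl: float) -> float:
    """The 50:50 pool ratio implies a deterministic floor: every $1 of
    pooled external assets is matched by $1 of pooled $CACAO, so
    $CACAO's market cap is at least the non-$CACAO TVL."""
    return external_tvl

remaining, credited = slash_bond(bond=1_500_000, stolen=400_000)
print(remaining, credited)  # 900000.0 600000.0
```

The `min` cap in `slash_bond` simply reflects that a bond can only be deducted down to zero.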
MAYA MAYA functions as a revenue-sharing token (https://docs.mayaprotocol.com/introduction/readme/cacao) with a fixed supply of 1 million tokens. Its primary purpose is to distribute protocol revenue to token holders while providing initial funding for protocol development. The MAYA token distribution follows this structure: - 15.6% allocated to founders (non-transferable) - 7% to RUNE token holders - 7% to early nodes - 7% to Tier 1 liquidity providers - 1% to Maya Mask NFT holders - 78% to the development fund MAYA token holders receive 10% of all protocol revenue (https://docs.mayaprotocol.com/introduction/readme/cacao) in the form of daily $CACAO distributions. This mechanism ensures that for every $9 earned by liquidity providers and nodes, MAYA holders, including the development team, earn $1, creating alignment between protocol development and long-term value creation. Unlike many protocol tokens, MAYA is not designed for active trading (https://docs.mayaprotocol.com/introduction/readme/cacao) and does not have native liquidity pools on the Maya Protocol. While external exchanges may choose to list the token, its primary function remains revenue distribution rather than trading. AZTEC AZTEC is a planned revenue-sharing token (https://docs.mayaprotocol.com/introduction/readme/cacao) designed for the forthcoming AZTECChain component of the protocol. Following a similar model to the MAYA token, AZTEC will capture 10% of all transaction fees generated on AZTECChain. The AZTEC token is scheduled for launch alongside AZTECChain, which will serve as a smart contract platform within the Maya Protocol ecosystem. AZTECChain will be built as a fork of the Cosmos Hub (https://docs.mayaprotocol.com/) , leveraging established infrastructure and the Cosmos smart contract development ecosystem. 
When operational, AZTECChain will enable advanced features such as: - Algorithmic stablecoins - Synthetic assets - CEX-style order book trading - Additional DeFi capabilities The protocol's documentation emphasizes that neither chain will artificially inflate yield to drive demand for their respective stablecoins or derivatives, maintaining a focus on sustainable economic design. Features and Services Cross-chain Swaps Maya Protocol's primary value proposition (https://docs.mayaprotocol.com/introduction/readme/roles/swapping) is enabling users to swap digital assets across different blockchain networks without traditional bridging or wrapping mechanisms. The protocol aims to provide superior user experience through open finance protocols and permissionless access to fast chains (like Dash), smart contract chains (such as Ethereum and Kujira), and censorship-resistant chains (like Bitcoin). A notable innovation in the protocol's swap mechanics (https://www.radixdlt.com/blog/maya-protocol-integration-is-now-complete) is the implementation of Streaming Swaps, which break large trades into multiple smaller components over time. This feature operates similarly to Time Weighted Average Price (TWAP) trades but with a 24-hour time limit, offering two key benefits: - The interval component allows arbitrageurs time to rebalance between sub-swaps - The quantity component reduces individual swap sizes to minimize price impact The protocol processes swaps (https://docs.mayaprotocol.com/introduction/readme/roles/swapping) through its Continuous Liquidity Pools, where each transaction involves: - Converting the input asset to $CACAO - Moving $CACAO between pools - Converting $CACAO to the desired output asset The entire process is handled atomically by the state machine, ensuring users never directly interact with $CACAO. 
Swap costs comprise two components (https://docs.mayaprotocol.com/introduction/readme/roles/swapping) : - A dynamic outbound fee based on network gas costs - Price slippage determined by trade size relative to pool depth Liquidity Provision Liquidity providers (LPs) can participate (https://docs.mayaprotocol.com/introduction/readme/roles/liquidity-providers) in the protocol by depositing assets into liquidity pools. The protocol supports both symmetrical (equal value of two assets) and asymmetrical (unequal values) deposits, though symmetrical additions are recommended for optimal performance. Maya Protocol implements an innovative Impermanent Loss Protection (ILP) mechanism (https://docs.mayaprotocol.com/deep-dive/how-it-works/impermanent-loss-protection-ilp) funded from the protocol reserve, which contains 10% of the total $CACAO supply. Key features include: - Protection begins 50 days after liquidity deposit - Coverage reaches 100% after 150 days if the asset outperforms $CACAO - Coverage reaches 100% after 450 days if $CACAO outperforms the asset - Protection is calculated at withdrawal time - Coverage accrues at 1% daily for outperforming assets and 0.25% daily for underperforming assets Liquidity providers earn returns through multiple streams (https://docs.mayaprotocol.com/introduction/readme/roles/liquidity-providers) : - Swap fees from trading activity - Block rewards from protocol emissions - Continuous income from the protocol's token reserve - Additional rewards based on the Incentive Pendulum mechanism Savings Program The protocol offers a savings feature (https://docs.mayaprotocol.com/deep-dive/mayachain-finance/savings) that allows users to earn yield with single-sided asset exposure using synthetic assets. 
The savings program operates through a two-step process: - Users mint synthetic versions of their assets - These synthetic assets are locked in a savings vault, with users receiving Saver Units representing their ownership Savers receive 50% of the yield generated by the synthetic collateral (https://docs.mayaprotocol.com/deep-dive/mayachain-finance/savings) , with the remaining 50% distributed to liquidity providers. This yield comes from: - Swap fees generated by the underlying liquidity - Protocol rewards - Additional protocol incentives Synthetic Assets Maya Protocol's synthetic asset model (https://docs.mayaprotocol.com/deep-dive/mayachain-finance/synthetic-asset-model) differs from traditional approaches by implementing a hybrid system that is both fully collateralized during existence and 1:1 pegged at redemption. This unique design provides capital efficiency while maintaining price stability. Synthetic assets are backed by constant-product liquidity, with: - 50% collateralization in the underlying asset - 50% collateralization in $CACAO - Pool rebalancing along a price curve to maintain stability - Price shifts subsidized by pool liquidity To manage high demand for synthetic assets (https://docs.mayaprotocol.com/deep-dive/mayachain-finance/synthetic-asset-model) , the protocol implements a Protocol-Owned Liquidity (POL) mechanism where the reserve can add $CACAO to pools when synthetic utilization approaches capacity limits. This helps maintain synthetic asset stability while protecting liquidity providers from excessive leverage exposure. The protocol continuously monitors synthetic asset utilization and automatically adjusts POL positions to maintain optimal market conditions and risk parameters. Governance Maya Protocol employs a minimalist governance approach designed to maintain network security while enabling necessary protocol adjustments.
The system deliberately limits governance scope (https://docs.mayaprotocol.com/deep-dive/how-it-works/governance) to prevent nodes from communicating or learning each other's identities, which could compromise network security through potential collusion. Asset Management The protocol implements a permissionless asset listing system (https://docs.mayaprotocol.com/deep-dive/how-it-works/governance) where users signal demand for new assets by staking in new pools. When the network identifies a new asset, it creates a pool in bootstrap mode, during which swapping is disabled. The selection process occurs periodically, with the network evaluating all bootstrapping pools and listing the one with the highest value. Assets can be delisted through two primary mechanisms (https://docs.mayaprotocol.com/deep-dive/how-it-works/governance) : - Complete liquidity withdrawal by all providers - Pool depth falling below minimum requirements When a new bootstrap pool is enabled, its depth is compared to existing active pools. If the new pool has greater depth, it may replace the smallest active pool, which returns to bootstrap mode. Chain Integration The process for adding new blockchain support (https://docs.mayaprotocol.com/deep-dive/how-it-works/governance) follows a structured approach: - Community developers create a new Bifröst module - Proposal submission through a MAYAChain Improvement Proposal (MIP) - Review and validation by core developers - Integration into MAYANode software - Network upgrade through node rotation - Chain activation upon 67% node adoption Chain removal follows a similar threshold mechanism, where support is discontinued when 67% of nodes stop monitoring a chain, initiating an automated process to return assets to their owners. 
Protocol Upgrades The upgrade process (https://docs.mayaprotocol.com/deep-dive/how-it-works/governance) encompasses three main components: - Application logic (blockchain operation) - Schema (vault key-value storage) - Network software (TSS protocol management) Upgrades are implemented through an asynchronous process where nodes can update their software during regular network churn cycles. The network automatically activates new features when 67% of nodes adopt the latest version, ensuring smooth transitions without disrupting consensus. Mimir System The Mimir feature (https://docs.mayaprotocol.com/deep-dive/how-it-works/constants-and-mimir) provides flexible control over network parameters through two distinct mechanisms: Node Mimir - Requires two-thirds majority of active nodes for implementation - Only counts votes from currently active nodes - Used for routine parameter adjustments Admin Mimir - Temporary override capability for testing purposes - Cannot control funds or critical security parameters - Planned for eventual removal from the protocol Economic Parameters The protocol maintains several key economic constants (https://docs.mayaprotocol.com/deep-dive/how-it-works/constants-and-mimir) that can be adjusted through governance, including: - Emission curve for $CACAO distribution - Incentive curve for reward allocation - Maximum available pools - Minimum $CACAO pool depth - Pool cycle duration Emergency Procedures Emergency governance actions (https://docs.mayaprotocol.com/deep-dive/how-it-works/governance) are intentionally difficult to coordinate due to the protocol's emphasis on node anonymity. The primary emergency mechanism is Ragnarök, which triggers when node count falls below four, initiating an automated fund distribution process and system shutdown. 
The protocol includes multiple halt mechanisms (https://docs.mayaprotocol.com/deep-dive/how-it-works/security) for emergency situations: - Individual node pause capability (limited to 720 blocks, approximately one hour) - Cumulative halt extension through multiple node participation - Chain-specific trading halts - Automatic solvency-triggered safety measures These governance mechanisms have enabled Maya Protocol to successfully integrate with multiple blockchain networks while maintaining security and operational efficiency. The recent integration with the Radix network (https://www.radixdlt.com/blog/maya-protocol-integration-is-now-complete) demonstrates the effectiveness of this governance model in facilitating network expansion while preserving decentralized control. Ecosystem User Interfaces Maya Protocol operates as backend infrastructure, requiring user interfaces for interaction. The protocol supports multiple decentralized exchanges and wallet interfaces (https://docs.mayaprotocol.com/introduction/maya-ecosystem/user-interfaces-and-wallets) to provide users with various access points to its services. ThorWallet DEX serves as a primary interface (https://docs.mayaprotocol.com/introduction/maya-ecosystem/user-interfaces-and-wallets) offering both web and mobile access. 
Key features include: - Support for all Maya Protocol-compatible Layer 1 blockchains - Native $CACAO and MAYA token management - Cross-chain swap functionality - Liquidity position management - Integration with hardware wallets like Ledger - Comprehensive saver position management El Dorado provides a web-based interface (https://docs.mayaprotocol.com/introduction/maya-ecosystem/user-interfaces-and-wallets) with distinctive features: - Support for all Maya-supported Layer 1 blockchains - Unique Polkadot blockchain integration - Compatible with XDEFI and Keystore wallets - Full liquidity management capabilities $CACAOSwap offers specialized features (https://docs.mayaprotocol.com/introduction/maya-ecosystem/user-interfaces-and-wallets) including: - Integration with multiple wallet types including Keystore, MetaMask, XDEFI, Keplr, and Leap - Cross-chain swap functionality - Liquidity and saver position management - Support for all Maya-supported Layer 1 blockchains Infrastructure Tools MayaScan serves as the primary blockchain explorer (https://docs.mayaprotocol.com/blockchain-explorer/mayascan) for Maya Protocol, offering: - Comprehensive transaction tracking - Swap and liquidity monitoring - Network status oversight - Node performance tracking - Secure peer-to-peer messaging capability - Native token and NFT trading support through Ordinals technology and Memos The Maya Info Bot provides real-time network monitoring (https://docs.mayaprotocol.com/introduction/maya-ecosystem/tools) across multiple channels (Telegram, Discord, and X), tracking: - Large asset transfers - Significant swap transactions - Liquidity pool statistics - $CACAO price movements - Mimir parameter adjustments - Network alerts and updates Mayans.app combines social media elements with decentralized finance (https://docs.mayaprotocol.com/introduction/maya-ecosystem/tools) , offering: - Secure private messaging - MRC-20 token trading and staking - DeFi gaming integration - Market trend signals - Social network features for the Maya community
Digital Assets Maya Masks represents the protocol's official NFT initiative (https://docs.mayaprotocol.com/deep-dive/maya-masks) , consisting of 1,689 Genesis Maya Masks. The collection features: - Two distinct categories: Golden Masks and regular Masks - Integration with Web3 avatars - Utility features including: - $CACAO staking rewards (planned for AZTECChain launch) - Community event access - $MAYA and $AZTEC token airdrops (4.5 tokens each per mask) - Enhanced benefits for Golden Mask holders The ecosystem supports multiple wallet solutions (https://docs.mayaprotocol.com/introduction/maya-ecosystem/user-interfaces-and-wallets) : XDEFI Wallet - Multi-ecosystem support for over 30 native blockchains - EVM and Cosmos chain compatibility - Hardware wallet integration with Ledger and Trezor - Native support for Bitcoin, Ethereum, Solana, and other major networks Hardware Wallet Support KeepKey integration (https://docs.mayaprotocol.com/introduction/maya-ecosystem/user-interfaces-and-wallets) provides: - Secure storage for $CACAO and $MAYA tokens - Native cross-chain swap capability in firmware - Support for Maya Layer 1 assets - Integration with multiple EVM and Cosmos chains Recent Developments The ecosystem continues to expand, with the recent integration of the Radix network (https://www.radixdlt.com/blog/maya-protocol-integration-is-now-complete) marking a significant milestone. This integration enables trustless cross-chain asset swaps between Radix and other supported networks, providing new opportunities for asset flow and ecosystem growth. The integration is particularly notable for enabling access to Maya Protocol's $63 billion TVL across supported chains for Radix users, while maintaining the protocol's commitment to security and decentralization.
Technical Infrastructure Node Operation The Maya Protocol network is serviced by MAYANodes (https://docs.mayaprotocol.com/introduction/readme/roles/node-operators) , which are designed to operate in a decentralized and anonymous manner. The network initially targets 120 nodes, each comprising multiple independent servers that work cooperatively to facilitate cross-chain swapping capabilities. The node infrastructure employs several innovative features (https://docs.mayaprotocol.com/) to maintain decentralization and security: - Capped Proof of Bond validator selection to maintain a high Nakamoto Coefficient - Periodic Validator Churning every 5 days to prevent stagnation - Asynchronous Network Upgrades allowing gradual transition to new protocol versions - Chain-agnostic Bifröst Protocol for managing various blockchain connections Maya Protocol implements a unique security approach (https://docs.mayaprotocol.com/introduction/readme/roles/node-operators) for node operations. Unlike traditional staking systems, nodes must directly pay for their bond rather than accepting delegation. This requirement ensures that economic security assumptions remain valid, as a node operator who pays $1 million for their bond would only attempt theft if they could access more than $1 million in funds. While public delegation is not permitted, the protocol allows for private delegation (https://docs.mayaprotocol.com/introduction/readme/roles/node-operators) under specific conditions: - Limited to 6 bonders per node - Requires direct whitelisting by node operators - Assumes trust relationships between operators and bonders - Maintains identical network operation regardless of delegation status Liquidity Nodes Model The Liquidity Nodes model (https://docs.mayaprotocol.com/deep-dive/how-it-works/liquidity-nodes) represents a significant innovation in blockchain infrastructure, addressing common issues with traditional staking systems.
Instead of requiring idle staked funds, nodes bond Liquidity Pairs, keeping bonded capital productive in the pools. The model maintains slashing mechanisms (https://docs.mayaprotocol.com/deep-dive/how-it-works/liquidity-nodes) while requiring over 75% of capital to be bonded by nodes, with an ideal target of 87% to prevent Sybil attacks. Slashed funds are transferred to Protocol Owned Liquidity, with provisions for both manual and automatic forgiveness. The system distributes two types of rewards: - Node Exclusive Rewards (NER): Reserved for liquidity bonded to MAYANodes - Liquidity Pool Rewards (LR): Distributed to all liquidity providers based on their share The model achieves superior capital efficiency (https://docs.mayaprotocol.com/deep-dive/how-it-works/liquidity-nodes) through: - Simultaneous earning of node rewards and liquidity rewards - Reduced bonding risk through continued liquidity rewards during standby states - Creation of deeper liquidity pools while maintaining network security - Implementation of a self-reinforcing liquidity flywheel effect Economic Model The protocol implements a dynamic inflation mechanism (https://docs.mayaprotocol.com/deep-dive/how-it-works/dynamic-inflation) that can be enabled or disabled through node voting. This feature activates when $CACAO held outside liquidity pools exceeds 10% of the non-reserve supply. The inflation rate follows the formula: Rate of inflation = (1 - y) * 40% + 1% where y represents the percentage of $CACAO in pools relative to total supply.
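The dynamic inflation formula above translates directly into code; the activation check follows the 10%-outside-pools trigger described in the same passage (function names are illustrative):

```python
def inflation_rate(pooled_fraction: float) -> float:
    """Documented formula: rate = (1 - y) * 40% + 1%, where y is the
    fraction of $CACAO in pools relative to total supply."""
    return (1 - pooled_fraction) * 0.40 + 0.01

def inflation_active(pooled: float, non_reserve_supply: float) -> bool:
    """The mechanism (when enabled by node vote) kicks in once $CACAO
    held outside liquidity pools exceeds 10% of the non-reserve supply."""
    return (non_reserve_supply - pooled) > 0.10 * non_reserve_supply

print(inflation_rate(1.0))                   # 0.01 -- 1% floor, fully pooled
print(round(inflation_rate(0.5), 4))         # 0.21 -- 21% when half is pooled
print(inflation_active(800_000, 1_000_000))  # True: 20% sits outside pools
```

The steep slope (40 percentage points across the pooled range) is what pushes idle $CACAO back into the pools.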
The protocol's economic balance (https://docs.mayaprotocol.com/deep-dive/how-it-works/incentive-pendulum) is maintained through an Incentive Pendulum mechanism that adjusts rewards between nodes and liquidity providers based on network conditions: - Optimal State: Maintains roughly 85% bonded LP and 15% pooled LP - Unsafe State: Triggers when pooled capital exceeds 25% of bonded capital - Automatic adjustment of reward distribution to maintain economic security Network Security The protocol implements multiple security layers (https://docs.mayaprotocol.com/deep-dive/how-it-works/security) : Proactive Measures - Confirmation counting for double-spend protection - Outbound transaction throttling - Automated solvency checking - Unauthorized transaction detection - Security event flagging Reactive Controls - Node operator triggered halts - Chain-specific trading suspensions - Emergency shutdown procedures - Automatic security alerting system These infrastructure components work together to enable Maya Protocol's cross-chain functionality while maintaining security and efficiency. The recent successful integration with the Radix network (https://www.radixdlt.com/blog/maya-protocol-integration-is-now-complete) demonstrates the robustness of this infrastructure in supporting network expansion while maintaining operational security. Future Development Roadmap Overview Maya Protocol's development roadmap for 2024 focuses on expanding chain integrations, improving user experience, and introducing new technical features.
The protocol has already completed several major objectives (https://docs.mayaprotocol.com/introduction/roadmap-2024) , including: Completed Initiatives - Savers functionality, enabling users to earn fees without direct $CACAO exposure - Arbitrum integration, allowing Ethereum assets on the Arbitrum chain to participate in swaps and liquidity actions - Streaming Swaps implementation, which improves large trade execution by breaking them into smaller components - Radix network integration (https://www.radixdlt.com/blog/maya-protocol-integration-is-now-complete) , enabling trustless cross-chain asset swaps and creating new routes for TVL flow AZTECChain Development A significant component of Maya Protocol's future development (https://docs.mayaprotocol.com/) is AZTECChain, a smart contract platform being developed as a fork of the Cosmos Hub (Gaia). This initiative will enable: - Algorithmic stablecoin implementation - Derivatives such as Synths - CEX-style order book trading - Enhanced DeFi capabilities The protocol emphasizes a conservative approach to these features, noting that algorithmic stablecoins will be delayed upon launch to ensure proper economic design and testing through bounties. Additionally, neither chain will subsidize yield to artificially inflate demand for stablecoins or derivatives. 
Planned Integrations The protocol's roadmap (https://docs.mayaprotocol.com/introduction/roadmap-2024) includes several major blockchain integrations: THORChain DEX Aggregation - Enabling swaps between THORChain and MAYAChain assets - Creating broader liquidity networks across protocols Privacy-Focused Integration - Planned Zcash integration to support $ZEC deposits - Expansion into privacy-preserving transaction capabilities Additional Network Support - Cardano integration for $ADA token support - Memoless transaction capabilities to expand wallet and chain compatibility - Additional blockchain integrations to be announced Technical Enhancements The development team is working on several technical improvements to enhance the protocol's functionality: Memoless Transactions This feature (https://docs.mayaprotocol.com/introduction/roadmap-2024) will expand the protocol's compatibility by: - Enabling integration with chains that don't use memos - Broadening potential wallet integrations - Simplifying user experience across different blockchain ecosystems Infrastructure Scaling The protocol's architecture (https://docs.mayaprotocol.com/deep-dive/how-it-works/technology) allows for continuous improvement through: - Asynchronous network upgrades - Sharded Asgard vault scaling - Enhanced cross-chain communication protocols Economic Evolution The protocol's economic model (https://docs.mayaprotocol.com/deep-dive/how-it-works/dynamic-inflation) continues to evolve through: - Refinement of the dynamic inflation mechanism - Enhancement of the Incentive Pendulum system - Development of sustainable yield generation methods Integration Impact The recent Radix integration (https://www.radixdlt.com/blog/maya-protocol-integration-is-now-complete) demonstrates Maya Protocol's commitment to expanding cross-chain liquidity and accessibility. 
This integration has: - Enabled trustless cross-chain asset swaps with Radix - Created new routes for TVL flow across supported chains - Provided Radix ecosystem participants access to Maya Protocol's liquidity - Enhanced key network metrics including TVL, weekly transactions, and user acquisition The successful implementation of Streaming Swaps (https://www.radixdlt.com/blog/maya-protocol-integration-is-now-complete) during this period also showcases the protocol's ability to introduce complex features while maintaining security and efficiency. This feature has proven particularly valuable for large trades, as it: - Reduces slippage through trade size optimization - Improves price execution through temporal distribution - Allows arbitrageurs to maintain pool balance during execution These developments suggest a strong foundation for future growth and integration, with the protocol positioned to continue expanding its cross-chain capabilities while maintaining its commitment to security and decentralization. ## MattiaNode URL: https://radix.wiki/ecosystem/mattianode Updated: 2026-02-06 Summary: MattiaNode is a validator on the Radix network created by Mattia, an active contributor to the Radix community. MattiaNode is a validator on the Radix network created by Mattia (https://www.notion.so/Mattia-4edffe051d294347beb7feb147662129?pvs=21) , an active contributor to the Radix community. Overview MattiaNode validator is a node used in the Radix network. It is a self-hosted and reliable validator node located in Italy. The MattiaNode validator is known for its decentralized setup, as it is not hosted on popular cloud providers like AWS or Azure. It runs on a beefy self-hosted server in Italy, equipped with an AMD Ryzen processor, 48GB of DDR4 RAM, 1TB of NVMe SSD storage, and a 1Gbit/s fiber connection. Choosing a validator node is crucial for staking $XRD on the Radix network. 
MattiaNode offers geographical decentralization and provider decentralization, making it an attractive option for individuals looking to stake their $XRD tokens. Mission The mission of MattiaNode validator is to contribute to the security, stability, and decentralization of the Radix network. As a self-hosted validator node, MattiaNode aims to provide a reliable and robust infrastructure for participating in the consensus protocol of the Radix network. By running their own validator node, the operator of MattiaNode actively contributes to securing the Radix network and ensuring the accuracy and integrity of transactions and data within the network. The node helps in validating transactions and reaching consensus on the state of the Radix ledger, making it an essential component of the network's overall functioning. In addition to its technical role, MattiaNode also plays a role in the Radix community. Mattia, the operator of the validator, has joined the RadixTalk forum as a moderator, actively engaging with the community and assisting in addressing queries related to the Radix network and its validator setup. Overall, the mission of MattiaNode validator encompasses supporting and furthering the goals of the Radix network, which include building a scalable, decentralized, and interoperable infrastructure for the next generation of decentralized applications. Benefits By joining the MattiaNode validator, participants can benefit from several advantages, including: Secure and Reliable Infrastructure As a validator node, MattiaNode provides robust and reliable infrastructure for participating in the consensus protocol of the Radix network. With its dedicated server and backup power supply, the validator node ensures uninterrupted and secure operation. Decentralization MattiaNode differentiates itself by not relying on popular cloud providers like AWS or Azure. This contributes to the decentralization of the network by diversifying the providers. 
Rewards Individuals can delegate their $XRD tokens to validators like MattiaNode and earn rewards for securing the network. By joining the validator, participants can earn rewards from their delegated stake. Participation in Network Governance Validators hold a stake in network governance by participating in the consensus protocol of the Radix network. As such, joining the MattiaNode validator enables individuals to contribute to the decision-making process of the network. Contribution to a Growing Ecosystem Joining the MattiaNode validator contributes to the growth of the Radix ecosystem by providing a more diversified and robust network. Participants in the validator node help increase the security and decentralization of the Radix network. Services Besides staking, MattiaNode provides additional network services to support the Radix ecosystem. These services include: Validator Infrastructure MattiaNode operates on a powerful self-hosted server located in Italy. The server is equipped with an AMD Ryzen processor, 48GB of DDR4 3600MHz RAM, 1TB of NVMe SSD storage (WD SN750), and a 1Gbit/s fiber connection. Additionally, the infrastructure is backed up with an uninterruptible power supply (UPS) and a 4G LTE redundant internet connection. Decentralized Network Operations MattiaNode prioritizes decentralization in its network operations. They do not rely on centralized cloud service providers like AWS, Azure, or Google Cloud, which helps avoid centralization and supports a more distributed network. Geographical Decentralization MattiaNode runs its node in Italy, contributing to the geographical decentralization of the Radix network. Green and Sustainable Energy MattiaNode's electricity supply is certified as 100% green and renewable. It is fed into the grid and produced by plants powered by renewable energy sources. Staking process The staking process for the MattiaNode validator is fairly straightforward. 
Here are the general steps: Create a Radix Wallet First, you'll need to create a Radix Wallet to store your $XRD tokens. You can create a wallet using the Radix Desktop Wallet or the Radix Web Wallet. Purchase $XRD Tokens Next, you'll need to purchase $XRD tokens from a cryptocurrency exchange that supports $XRD, such as Bitmax, Kucoin or Uniswap. You can then transfer these tokens to your Radix Wallet. Delegate $XRD Tokens Once your $XRD tokens are in your Radix Wallet, you can delegate them to the MattiaNode validator by using the Radix Wallet's delegation feature. You'll need to enter the validator's node address into the delegation form and set the amount of $XRD tokens you'd like to delegate. Earn Rewards The more $XRD tokens you delegate to the MattiaNode validator, the higher the potential rewards you can earn. Validator rewards are distributed based on the amount of tokens delegated, and the validator's performance and participation in the consensus protocol. Monitor Your Delegation It's important to monitor your delegation and the performance of the MattiaNode validator. You can check the status of your delegation and your rewards using the Radix Wallet's delegation dashboard. Validators may occasionally experience downtime or missed blocks, which could impact your rewards. Security and Trust Security and trust are essential considerations when choosing a validator like MattiaNode. Here are some key factors to understand about the security and trustworthiness of the MattiaNode validator: Reputation Building trust in the blockchain space often relies on a validator's reputation. Validators with a solid reputation are more likely to have a track record of reliability, honesty, and strong security practices. Assessing the reputation of MattiaNode can involve researching their history, community involvement, and any past contributions to the Radix ecosystem. 
Infrastructure and Reliability As a validator, MattiaNode is responsible for maintaining secure and reliable infrastructure to support the staking process. This includes maintaining secure server systems, implementing high availability measures, and ensuring robust network connectivity. Users should evaluate the technical capabilities and infrastructure of MattiaNode to ensure reliable staking operations. Security Measures Validators like MattiaNode should implement strict security measures to protect against potential attacks or breaches. These measures typically include secure server setups, encryption protocols, regular security audits, and best practices for handling user funds and data. Transparently communicating their security measures can enhance trust in the validator. Openness and Transparency Validators that prioritize openness and transparency can build trust with the community. This can include regularly sharing updates and reports on network participation, performance statistics, and any operational issues. Validators that actively engage with the community, provide clear communication channels, and respond promptly to inquiries contribute to trust and confidence. Community Feedback Evaluating community feedback and sentiment is another way to gauge trustworthiness. Engage with the Radix community, participate in forums, and research community sentiment and experiences with MattiaNode to get a sense of the validator's reputation and trust level. ## Lucky8 🍀 URL: https://radix.wiki/ecosystem/lucky8 Updated: 2026-02-06 Summary: Lucky8 is a project that involves both the Radix network and a lossless lottery. In this lottery, participants are required to stake a minimum amount of 1000 $XRD. Lucky8 is a project that involves both the Radix network and a lossless lottery. In this lottery, participants are required to stake a minimum amount of 1000 $XRD into a validator node on the Radix network. 
By staking, participants earn $LUCK tokens on their staked assets each month and become eligible for a monthly prize draw. The project aims to provide a win-win situation for participants, as they have the opportunity to earn $LUCK tokens while also having a chance to win big-money Lucky8 prizes that are paid out in Radix. Partnership In addition to the prize pool, Lucky8 has partnered with various projects that have contributed their tokens or NFTs as additional prizes in the lottery drawings. Some of the partners mentioned include RippyClip, Radix Inu, Radical Penguins, Radical Robos, Natty Radishes, Radix Panda, Radical Dinosaurs, RadHeads, StakeNordic, VikingLand, XRDScan, RadLads, Mutant Cat Society, PooP, and Radix Frogz Society. Rewards Lucky8 is a lossless lottery platform on Radix that allows users to stake Radix ($XRD) tokens and earn $LUCK tokens. They also conduct monthly and yearly lottery drawings where participants have the chance to win a minimum monthly jackpot of 20,000 $XRD and a minimum yearly jackpot of 120,000 $XRD. Token economics $LUCK Token The $LUCK token is the native token of Lucky8. By staking Radix ($XRD) with their Validator node, users can earn $LUCK as rewards. Farming LUCK Staking Radix ($XRD) with Lucky8 allows users to farm $LUCK tokens. The exact details of the farming process, such as the rate of token generation or the farming duration, are not explicitly documented. Monthly Jackpot The minimum monthly jackpot for the Lucky8 lottery is 20,000 $XRD. However, more specific details about how the jackpot is distributed or funded are not provided. Yearly Jackpot The minimum yearly jackpot for the Lucky8 lottery is 120,000 $XRD. Again, the details regarding the distribution or funding of this jackpot are not specified. Spending $LUCK Tokens Users can spend their LUCK tokens to participate in the yearly drawing. 
Fractional $LUCK tokens are not returned, and users need to spend their tokens by May 31st, 2023, 23:59:59 UTC to be eligible for the drawing. The drawing offers a chance to win 120,000 $XRD. How to play To play the Lucky8 lottery you first need to stake at least 1000 $XRD with their Validator node. Here are the basic steps: - Copy the Lucky8 validator node address - Go to your Radix wallet and navigate to the Stake/Unstake section - In the "Stake Tokens" tab, paste the Lucky8 validator node address into the "Validator" field - Enter the amount of $XRD you want to delegate to Lucky8 (the minimum is 1000 $XRD) - Click "Stake" and confirm the transaction Once you have staked at least 1000 $XRD with Lucky8, you will begin earning $LUCK tokens. These tokens can be used to participate in Lucky8’s monthly and yearly lottery drawings. ## Leafnode URL: https://radix.wiki/ecosystem/leafnode Updated: 2026-02-06 Summary: Leafnode is a Radix validator based in the Netherlands. Radix is a decentralized and secure protocol that enables developers to create scalable decentralized applications. Leafnode is a Radix validator based in the Netherlands. Radix is a decentralized and secure protocol that enables developers to create scalable decentralized applications (dApps). Validators are nodes that verify transactions and propose new blocks in the Radix network. Stakers, on the other hand, are users who delegate their tokens to validators to help secure the network and receive rewards in return. Leafnode is one of the validators in the Radix network that users can delegate their tokens to. By staking their tokens with Leafnode, users can help secure the Radix network and earn a share of the validator's rewards. The amount of rewards a user can earn depends on various factors, such as the amount of tokens they stake, the duration of the staking period, and the validator's performance. Founder Daan is an enthusiastic Dutch computer science student and a member of the Ociswap (/ecosystem/ociswap) team. 
He expresses a strong passion for distributed ledger technology, with Radix being his favorite project in the space. Daan runs a world-class Radix validator node for users to safely stake their tokens. Daan highlights some advantages of staking with him, including being an active community member, offering a minimal fee percentage, and allowing users to earn interest on their staked XRD tokens. He actively maintains and monitors the node and has even developed his own solutions, such as a Raspberry Pi-powered dashboard. Benefits Here are some potential benefits (https://krulknul.com/) of staking with Leafnode: High Performance Leafnode emphasizes the use of high-quality Linux servers that meet or exceed the system specifications recommended by RDX Works, the company behind Radix. This indicates that their nodes are designed for optimal performance and reliability. High Uptime Leafnode ensures high uptime by having a secondary Linux server ready to take over if the primary node encounters any issues. Additionally, they use automation tools like Ansible to quickly deploy a new node in case of failure. Monitoring Leafnode uses Grafana for monitoring the performance and uptime of their nodes. They have also designed a Raspberry Pi-powered monitoring dashboard that provides real-time information and will be active 24/7. This commitment to monitoring ensures that the node's status and performance are consistently tracked, allowing for proactive maintenance and issue resolution. Decentralization The website mentions that too much $XRD has been staked to nodes running in American data centers. By running their node in the Dutch city of Alblasserdam and having a backup node in Germany, Leafnode aims to contribute to the decentralization of the network. Security Leafnode places importance on network security and protecting stakers' rewards. 
They have taken measures such as employing firewalls, implementing 2FA (two-factor authentication), and utilizing DDoS (Distributed Denial of Service) protection to safeguard the validator against potential attacks. ## launchspace URL: https://radix.wiki/ecosystem/launchspace Updated: 2026-02-06 Summary: Launchspace is an accelerator for decentralized applications (dApps) built on the Radix platform. In addition, it serves as a marketplace for Scrypto blueprints and audits. Launchspace is an accelerator for decentralized applications (dApps) built on the Radix platform. In addition, it serves as a marketplace for Scrypto blueprints and audits. This means that Launchspace provides support and resources to developers who are building dApps on the Radix platform. It also offers a platform for developers to share and sell their Scrypto blueprints and audits. Scrypto is Radix's asset-oriented smart contract language, and blueprints written in it serve as reusable templates for secure dApp components. Mission The mission of Launchspace is to drive the adoption and success of decentralized applications built on the Radix platform. They aim to achieve this by providing an accelerator program that supports developers in building and launching their dApps efficiently. Launchspace also serves as a marketplace for Scrypto blueprints and audits. This marketplace enables developers to share and sell their Scrypto blueprints, which are reusable building blocks for secure and efficient dApps. Additionally, developers can also offer their audit services to ensure the security and reliability of dApps built on Radix. In summary, Launchspace's mission is to foster innovation and growth in the Radix ecosystem by providing the necessary resources, support, and marketplace infrastructure for developers to create and commercialize their dApps while ensuring the highest level of security and trust. 
Features Some of the key features of Launchspace include: Accelerator for Radix dApps Launchspace provides support and resources to developers who are building decentralized applications (dApps) on the Radix platform. This includes mentorship, technical assistance, and guidance to help accelerate the development process. Marketplace for Scrypto Blueprints Launchspace offers a marketplace where developers can share and sell their Scrypto blueprints. Scrypto is the asset-oriented smart contract language used to build secure dApps on Radix. Developers can showcase their blueprints and make them available for others to use as a starting point for their own dApp development. Marketplace for Audits Launchspace also serves as a marketplace for audits. This means that developers can offer their services to conduct audits of dApps built on the Radix platform. Audits help ensure the security and reliability of dApps and provide trust to users and investors. These features work together to support the development and growth of the Radix ecosystem, providing resources and opportunities for developers to create innovative and secure dApps. Blueprint A blueprint of Launchspace includes the following elements: Accelerator Program Launchspace provides an accelerator program specifically designed to support developers building decentralized applications (dApps) on the Radix platform. This program offers mentorship, technical assistance, and guidance to help accelerate the development process of dApps. Scrypto Blueprint Marketplace Launchspace serves as a marketplace where developers can share, showcase, and sell their Scrypto blueprints. These blueprints are the foundational building blocks for secure and efficient dApps. Developers can offer their blueprints for others to use as a starting point for their own dApp development. 
Audit Marketplace Launchspace also operates a marketplace where developers can offer their audit services. Audits play a crucial role in ensuring the security and reliability of dApps, and Launchspace provides a platform for developers to showcase and offer their auditing expertise to others in the Radix ecosystem. Overall, Launchspace's blueprint consists of an accelerator program to support dApp development on the Radix platform and marketplaces for Scrypto blueprints and audits. These initiatives aim to drive innovation, adoption, and success in the Radix ecosystem. Benefits There are several benefits of joining Launchspace as a developer building decentralized applications (dApps) on the Radix platform. These benefits include: Access to an Accelerator Program Launchspace provides an accelerator program specifically designed to support developers building dApps on the Radix platform. This program offers mentorship, technical assistance, and guidance to help accelerate the development process of dApps. Networking Opportunities By joining Launchspace, developers can connect with other developers, entrepreneurs, industry experts, and investors in the Radix ecosystem. This offers networking opportunities that can lead to new partnerships, funding, and growth opportunities. Marketplace for Scrypto Blueprints and Audits Launchspace provides a marketplace for Scrypto blueprints and audits. Developers can showcase their blueprints and offer their audit services to others in the Radix ecosystem, creating new revenue streams and opportunities for growth. Exposure and Visibility Joining Launchspace can help raise the profile of a developer's dApp, providing exposure and visibility to potential users and investors. Technical Resources and Support Launchspace can provide technical resources and support to developers when building their dApps on the Radix platform. This can help reduce technical challenges and shorten development timeframes. 
Overall, joining Launchspace as a developer can provide numerous benefits, including access to an accelerator program, networking opportunities, marketplaces for Scrypto blueprints and audits, exposure and visibility, and technical resources and support. Security Launchspace places a strong emphasis on security, particularly in relation to decentralized applications (dApps) built on the Radix platform. Here are some key aspects of security in Launchspace: Scrypto Blueprints Launchspace operates a marketplace for Scrypto blueprints, which are foundational building blocks for secure dApps. By providing a platform for developers to share and sell these blueprints, Launchspace encourages the use of secure design patterns and best practices in dApp development. Audits Launchspace also serves as a marketplace for audits, ensuring that dApps built on the Radix platform undergo thorough security evaluations. These audits help identify and address potential vulnerabilities or weaknesses in dApps, providing a higher level of security for users and investors. Technical Support Launchspace offers technical support and guidance to developers building dApps on Radix. This support includes assistance with security-related aspects such as secure coding practices, encryption, authentication, and secure storage of user data. Collaboration and Knowledge Sharing By being part of Launchspace, developers have the opportunity to collaborate and share knowledge with other developers in the Radix ecosystem. This promotes the adoption of security best practices and helps the community stay vigilant against emerging security threats. Continuous Improvement Launchspace is committed to continuously improving security within its ecosystem. This includes staying up to date with the latest security practices, technologies, and industry standards to ensure a robust and secure environment for developers and users alike. 
## Juicy Stake URL: https://radix.wiki/ecosystem/juicy-stake Updated: 2026-02-06 Summary: Juicy Stake is a provider of validator staking services on various Proof of Stake blockchains, including Radix. It aims to enable consumers to stake their cryptocurrencies. Juicy Stake is a provider of validator staking services on various Proof of Stake blockchains, including Radix. It aims to enable consumers to stake their cryptocurrencies through its validator nodes. Overview Juicy Stake was founded by Knox Trades, a systems, networking, and cloud architect. Juicy Stake focuses on delivering high availability validator nodes to ensure a reliable staking experience for users. By delegating stake to Juicy Stake, stakers can earn rewards, with Juicy Stake offering 98% of staking rewards to its users while retaining a small 2% fee for node maintenance. Sources: Juicy Stake - Staking Guide (https://docs.juicystake.org/how-to/staking-guide) Juicy Stake - High APY Crypto Staking (https://www.juicystake.io/) Mission Juicy Stake validator's mission is to provide high availability validator nodes for users to stake their cryptocurrencies and earn rewards. Here are some key aspects of Juicy Stake's mission: Reliable Staking Services Juicy Stake aims to offer a reliable staking experience by maintaining high availability validator nodes. By delegating their stake to Juicy Stake, users can trust that their staking rewards will be generated consistently and securely. Competitive Rewards Juicy Stake is committed to providing competitive rewards to its users. They offer 98% of staking rewards to stakers while retaining only a small 2% fee for node maintenance. This ensures that stakers can earn a significant portion of the rewards generated through staking. Transparency and Trust Juicy Stake believes in maintaining transparency and trust with its users. They provide information on the location of their validator node and adhere to the Radix Validators Code of Conduct. 
By being transparent about their operations, Juicy Stake aims to build trust among stakers and ensure a reliable staking experience. Additional Rewards Juicy Stake offers a monthly lottery where stakers have the chance to receive additional rewards. This adds an element of excitement and the opportunity for users to earn even more rewards from their staking activities. These elements contribute to Juicy Stake's mission of providing accessible and rewarding staking services, ensuring that users can make the most of their cryptocurrencies while maintaining trust and transparency. Staking Process The staking process with Juicy Stake validator involves a few steps to delegate your cryptocurrencies and start earning rewards. Here is a general overview of the staking process with Juicy Stake: Choose Supported Networks Juicy Stake offers staking services on various Proof of Stake blockchains, including Radix. Ensure that the cryptocurrency you wish to stake is supported by Juicy Stake. Wallet Setup Set up a compatible wallet that supports staking on the specific blockchain. For example, for Radix (XRD) staking, you can use the Radix Desktop Wallet. Acquire Cryptocurrency Obtain the cryptocurrency you want to stake. Make sure you have enough funds to meet any minimum staking requirements. Delegate Stake Follow the instructions provided by Juicy Stake on their official website or staking guide to delegate your stake. This typically involves accessing your wallet, selecting the validator option, and entering the Juicy Stake node address. Confirm Delegation Confirm the staking transaction through your wallet and wait for it to be processed on the blockchain. This confirmation may take some time, depending on the network. Staking Rewards Once your stake is delegated to Juicy Stake validator, you will begin earning staking rewards. The rewards you earn will be distributed based on the staking parameters and the specific blockchain's reward distribution mechanism. 
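The 98/2 reward split quoted earlier is simple arithmetic; the sketch below illustrates it in Python. The function and variable names are illustrative only, not part of any Juicy Stake API.

```python
# Sketch of the 98/2 validator reward split that Juicy Stake advertises:
# 98% of gross staking rewards go to the staker, 2% is retained for
# node maintenance. Names here are hypothetical, for illustration only.

VALIDATOR_FEE_RATE = 0.02  # 2% retained by the validator

def split_rewards(gross_rewards_xrd: float) -> tuple[float, float]:
    """Return (staker_share, validator_fee) for a gross reward amount in XRD."""
    fee = gross_rewards_xrd * VALIDATOR_FEE_RATE
    return gross_rewards_xrd - fee, fee

staker_share, fee = split_rewards(1000.0)
print(staker_share, fee)  # a 1000 XRD gross reward splits roughly 980 / 20
```

The same shape applies to any validator that quotes a flat fee percentage; only `VALIDATOR_FEE_RATE` changes.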
Monitor and Manage Stake Regularly check the status of your delegated stake and monitor your staking rewards. You may have the option to compound your rewards by restaking them or withdraw them, depending on the specific staking protocol. It's essential to follow the specific staking instructions provided by Juicy Stake and the guidelines of the chosen blockchain network to ensure a smooth staking process. Benefits There are several benefits to using Juicy Stake validator for staking on Proof of Stake blockchains. Here are some of the most important benefits: Higher Staking Rewards By delegating their tokens to Juicy Stake validator, users can earn staking rewards at a higher rate than they could by staking their tokens on their own. Juicy Stake validator achieves this by leveraging its advanced infrastructure and economies of scale. Secure and Reliable Infrastructure Juicy Stake validator offers a secure and reliable infrastructure for staking on Proof of Stake blockchains. Their infrastructure is built with redundancy and security in mind, ensuring that stakers' assets are always secure. Simple and User-Friendly Platform Juicy Stake validator provides a simple and user-friendly platform for stakers to easily delegate their tokens - even if they have little to no experience with staking. Monthly Rewards Lottery Juicy Stake validator provides an opportunity for users to participate in a monthly lottery and receive additional rewards on top of their regular staking rewards. Cost-effective Staking with Juicy Stake validator allows stakeholders to enjoy the benefits of staking without the need for expensive equipment or resources. Juicy Stake validator charges a 2% fee on staking rewards for node maintenance, making it an affordable option for many users. 
Overall, using Juicy Stake validator for staking on Proof of Stake blockchains provides several benefits, including higher staking rewards, secure infrastructure, user-friendly platform, monthly lottery, and affordability. Rewards Juicy Stake validator offers staking rewards to users who delegate their tokens to their validator. The specific details of the rewards structure can vary depending on the blockchain being staked. For example, on the Radix network, Juicy Stake offers stakers 98% of the staking rewards, while retaining 2% for node maintenance. This means that stakers would earn the majority of the rewards generated through the staking process. Security and Trust Juicy Stake validator aims to prioritize security and trust to ensure the safety of user funds and provide a reliable staking service. Here are some aspects that contribute to the security and trustworthiness of Juicy Stake validator: Infrastructure Juicy Stake validator operates with a robust and highly secure infrastructure. They employ industry best practices to protect their nodes and user assets. This includes implementing firewalls, encryption, and other security measures to safeguard against external threats. Experienced Team The team behind Juicy Stake validator consists of blockchain experts and professionals with experience in the industry. Their expertise helps in maintaining the security and reliability of the staking services provided. Transparency Juicy Stake validator emphasizes transparency by providing regular updates and communicating with their community. They are open about their operations and provide information regarding their infrastructure, staking rewards, fees, and any other important details that stakeholders need to know. Reputation Building and maintaining a solid reputation is crucial in the blockchain industry. 
Juicy Stake validator has established itself as a trustworthy and reliable staking service provider by delivering consistent rewards and maintaining the security of user funds. User Feedback A positive reputation is often built upon positive user experiences. Juicy Stake validator has received positive feedback and reviews from users who have utilized their staking services, demonstrating the trust that users have in their platform. ## Instabridge URL: https://radix.wiki/ecosystem/instabridge Updated: 2026-02-06 Summary: Instabridge is a regulated cross-chain bridge service that enables the transfer of digital assets between the Ethereum and Radix blockchain networks. Initially Instabridge is a regulated cross-chain bridge service that enables the transfer of digital assets between the Ethereum and Radix blockchain networks. Initially developed as a one-way bridge for converting eXRD tokens to XRD (https://learn.instabridge.io/article/signing-into-instabridge) , it has evolved into a comprehensive two-way wrapping service that supports multiple cryptocurrencies and digital assets. https://youtu.be/srRWQP4aDXw (https://youtu.be/srRWQP4aDXw) Overview Instabridge operates as a Virtual Asset Service Provider (VASP) and requires users to complete Know Your Customer (KYC) verification through its integrated Instapass identity verification system (https://learn.instabridge.io/article/how-do-i-add-a-new-radix-address) . This regulatory compliance framework enables Instabridge to facilitate secure and compliant cross-chain transactions while maintaining user privacy and security standards. Instabridge's core functionality includes the ability to wrap various ERC-20 tokens from the Ethereum network into their equivalent "x-assets" on the Radix network at a 1:1 ratio, minus applicable fees (https://learn.instabridge.io/article/does-instabridge-have-any-fees) . 
The platform integrates with popular wallet services, requiring the Metamask browser extension for Ethereum transactions and the Radix Wallet for Radix network operations. As of December 2024, Instabridge has processed over $100,000 in bridged assets (https://medium.com/instalabs-io/first-100k-of-xassets-bridged-onto-radix-01dc9fba13ca) including wrapped versions of major cryptocurrencies such as Bitcoin (xBTC), Ripple (xXRP), and Cardano (xADA). The platform employs a transaction status tracking system (https://learn.instabridge.io/article/new-instabridge-transaction-statuses) that provides users with real-time updates on their cross-chain transfers, including statuses such as "In Flight," "Completed," "Refunded," and "Failed." Recent platform upgrades have significantly improved (https://medium.com/instalabs-io/announcing-major-upgrades-to-instabridge-faster-smoother-and-more-tokens-b74dc6967296) transaction processing speeds and reduced fees, with transactions into the Radix network becoming free of charge and outbound transactions incurring a fee of 15 basis points, capped at $500. The service has also expanded its token support to include 20 additional cryptocurrencies, with plans to introduce retail functionality for these new assets in phases during Q1 2025. History Instabridge was developed by Metapass (Radix) Ltd. as a solution for bridging assets between the Ethereum and Radix blockchain networks. The service was initially launched as a one-way bridge (https://learn.instabridge.io/article/signing-into-instabridge) specifically designed to enable users to convert eXRD tokens to XRD at a 1:1 ratio. Following its initial release, the platform expanded its capabilities to become a two-way wrapping service, allowing users to bridge various digital assets between the two networks. This evolution included support for major cryptocurrencies such as ETH, wBTC, USDT, and USDC, with all conversions maintaining a 1:1 ratio minus applicable fees. 
In December 2024, Instabridge underwent significant upgrades (https://medium.com/instalabs-io/announcing-major-upgrades-to-instabridge-faster-smoother-and-more-tokens-b74dc6967296) that transformed its operational capabilities. These improvements included a 5x increase in transaction processing speed and a substantial reduction in delayed transactions. The platform also implemented a new fee structure, eliminating charges for transactions into the Radix network and reducing fees for outbound transactions to 15 basis points. The platform achieved a notable milestone (https://medium.com/instalabs-io/first-100k-of-xassets-bridged-onto-radix-01dc9fba13ca) in December 2024 when it surpassed $100,000 in bridged assets for various x-assets on the Radix network, including wrapped versions of Bitcoin, Ripple, Cardano, Bitcoin Cash, Stellar, and Ethereum. This achievement marked significant progress in establishing liquidity for these assets on the Radix network. The expansion continued with the announcement of support for 20 additional tokens, including popular assets such as BNB, XRP, ADA, TRX, AVAX, XLM, and AAVE. These new additions were initially made available exclusively to business users, with plans for phased retail access implementation during Q1 2025, coinciding with the anticipated launch of the Anthic ecosystem. Features Instabridge's primary feature is its token wrapping and unwrapping capability, which enables users to convert between ERC-20 tokens and their Radix network equivalents (https://learn.instabridge.io/article/swapping-wrapping-tokens-on-instabridge) . The platform designates wrapped assets on the Radix network with an "x" prefix, maintaining a 1:1 conversion ratio for all supported tokens. 
The platform integrates with multiple wallet systems, requiring users to have both the Radix Wallet app and the Metamask browser extension (https://learn.instabridge.io/article/how-do-i-add-a-new-radix-address) for full functionality. Users can manage multiple wallet addresses through their Instapass account, with the ability to set default addresses for both networks. Transaction monitoring is facilitated through a comprehensive status system (https://learn.instabridge.io/article/new-instabridge-transaction-statuses) that tracks transfers through various states: - "In Flight" status indicates active processing. - "Completed" confirms successful transaction completion. - "Refunded" signifies returned funds due to transaction issues. - "Failed" status indicates unsuccessful transactions with funds in holding. The platform implements strict security measures through its mandatory KYC verification system via Instapass (https://learn.instabridge.io/article/signing-into-instabridge) , which requires users to complete identity verification before accessing bridging services. This system includes AML (Anti-Money Laundering) tiers that determine transaction limits, with users able to apply for tier upgrades to increase their trading capacity. As of December 2024, the service has enhanced its performance metrics (https://medium.com/instalabs-io/announcing-major-upgrades-to-instabridge-faster-smoother-and-more-tokens-b74dc6967296) with a fivefold increase in transaction speed and significant reductions in processing delays. The platform maintains support for a growing list of digital assets, including major cryptocurrencies and stablecoins, with regular additions to its supported token list. Technical Architecture Instabridge's technical infrastructure is built around secure wallet integrations and automated transaction processing systems. 
The platform requires the Metamask browser extension for Ethereum-based operations (https://learn.instabridge.io/article/signing-into-instabridge) , using it to facilitate secure login authentication and transaction signing. For Radix network interactions, users must install the Radix Wallet Connector browser extension and mobile app (https://learn.instabridge.io/article/how-do-i-add-a-new-radix-address) . The platform implements a QR code-based wallet linking system that enables secure connection between users' browser sessions and mobile wallet applications. This system allows users to verify and manage multiple wallet addresses while maintaining security through cryptographic signing processes. Transaction processing is handled through a multi-stage system (https://learn.instabridge.io/article/new-instabridge-transaction-statuses) that monitors and verifies transfers across both networks. The architecture includes automated refund mechanisms for failed transactions, with funds being returned to their originating addresses when issues occur. The platform's identity verification architecture is integrated with Instapass, which manages KYC verification and AML compliance. This integration enables automatic wallet address verification (https://learn.instabridge.io/article/how-do-i-add-a-new-radix-address) and ensures regulatory compliance across all transactions. Recent technical improvements have resulted in significant performance enhancements (https://medium.com/instalabs-io/announcing-major-upgrades-to-instabridge-faster-smoother-and-more-tokens-b74dc6967296) , including optimizations that reduce transaction delays and improve processing speeds. The system architecture supports the platform's expanding token list while maintaining consistent 1:1 conversion ratios for all supported assets. Fees and Processing Instabridge employs a tiered fee structure based on transaction type and direction. 
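As a concrete illustration, the revised fee schedule described earlier in this article (transfers into Radix free of charge; outbound transfers charged 15 basis points, capped at $500) can be sketched as follows. This is a hypothetical helper, not a real Instabridge API; amounts are kept in integer cents to avoid floating-point rounding.

```python
# Hypothetical sketch of the post-December-2024 fee schedule described
# in this article. Not a real Instabridge API; names are illustrative.

FEE_BPS = 15               # 15 basis points (0.15%) on outbound transfers
FEE_CAP_CENTS = 500 * 100  # fee capped at $500

def bridge_fee_cents(amount_cents: int, *, into_radix: bool) -> int:
    """Bridging fee, in cents, for a transfer of amount_cents."""
    if into_radix:
        return 0  # transfers into the Radix network are free
    fee = amount_cents * FEE_BPS // 10_000  # basis points -> fraction
    return min(fee, FEE_CAP_CENTS)
```

Under this schedule, a $10,000 outbound transfer pays $15, and the $500 cap binds once a transfer exceeds roughly $333,333.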
The platform charges a 0.25% flat processing fee for all transactions (https://learn.instabridge.io/article/does-instabridge-have-any-fees) , with an exception for eXRD to XRD conversions. Following platform upgrades in December 2024, the fee structure was revised (https://medium.com/instalabs-io/announcing-major-upgrades-to-instabridge-faster-smoother-and-more-tokens-b74dc6967296) to eliminate fees for transactions into the Radix network, while outbound transactions incur a fee of 15 basis points, capped at $500. For Ethereum-originated transactions, users must maintain sufficient ETH in their accounts to cover Ethereum gas fees (https://learn.instabridge.io/article/does-instabridge-have-any-fees) . Similarly, Radix-originated transactions require XRD for network fees, which are processed through the user's Radix Wallet. When unwrapping tokens from Radix to Ethereum, gas fees are deducted from the final ERC-20 token amount received. The platform's processing system has been optimized for speed and efficiency. Recent upgrades have achieved a fivefold increase in transaction speed (https://medium.com/instalabs-io/announcing-major-upgrades-to-instabridge-faster-smoother-and-more-tokens-b74dc6967296) while significantly reducing the frequency of delayed transactions. The transaction monitoring system (https://learn.instabridge.io/article/new-instabridge-transaction-statuses) tracks all transfers in real-time, with status updates typically resolving within one business day. Processing limits are determined by user AML tiers, with higher tiers allowing for larger transaction volumes. Users can request tier upgrades through the Instapass system to increase their transaction limits. Regulatory Compliance Instabridge operates as a regulated Virtual Asset Service Provider (VASP), implementing comprehensive compliance measures through its integrated identity verification system. 
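The tier-based limit check described above might look like the following sketch. The per-tier limits here are invented placeholders for illustration; the article does not state Instabridge's actual tiers or limits.

```python
# Illustrative sketch of an AML-tier transfer check. The tier limits
# below are invented placeholders, not Instabridge's real figures.

AML_TIER_LIMITS_USD = {  # hypothetical per-transfer limits by tier
    1: 1_000,
    2: 10_000,
    3: 100_000,
}

def check_transfer(amount_usd: int, aml_tier: int) -> str:
    """Return 'allowed' or a 'blocked' message for a proposed transfer."""
    limit = AML_TIER_LIMITS_USD.get(aml_tier, 0)
    if amount_usd <= limit:
        return "allowed"
    # A transfer exceeding the tier limit is blocked; the user must
    # upgrade their AML tier via Instapass before retrying.
    return "blocked: upgrade AML tier via Instapass"
```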
Users must complete KYC verification through Instapass (https://learn.instabridge.io/article/signing-into-instabridge) before accessing the platform's services, ensuring compliance with international regulations. The platform employs a tiered AML system that determines transaction limits based on user verification levels. Users connecting to Instabridge for the first time must agree to terms and conditions (https://learn.instabridge.io/article/signing-into-instabridge) and consent to personal data sharing for KYC and AML verification purposes. Transaction monitoring includes automatic compliance checks (https://learn.instabridge.io/article/swapping-wrapping-tokens-on-instabridge) that prevent transfers exceeding user AML tier limits. The system automatically blocks transactions that would exceed these limits, requiring users to upgrade their AML tier through Instapass before proceeding with larger transfers. For institutional users, Instabridge maintains additional compliance measures (https://medium.com/instalabs-io/first-100k-of-xassets-bridged-onto-radix-01dc9fba13ca) , including exclusive access to certain token pairs. The platform's expansion of supported assets includes phased implementation approaches that prioritize regulatory compliance while gradually extending access to retail users. Integration with Anthic Instabridge serves as a foundational infrastructure component for the Anthic platform, an upcoming decentralized finance (DeFi) service. The integration aims to enable real-time trading with low slippage (https://medium.com/instalabs-io/first-100k-of-xassets-bridged-onto-radix-01dc9fba13ca) by leveraging Instabridge's cross-chain liquidity capabilities. The platform's expansion of supported tokens (https://medium.com/instalabs-io/announcing-major-upgrades-to-instabridge-faster-smoother-and-more-tokens-b74dc6967296) is strategically aligned with Anthic's planned launch in Q1 2025. 
This expansion includes the addition of various digital assets, including stablecoins, memecoins, and alternative Layer 1 tokens, designed to create diverse liquidity options for the Anthic ecosystem. The integration is intended to combine centralized exchange efficiency with decentralized trading capabilities, utilizing Instabridge's streamlined asset bridging system to facilitate rapid and cost-effective transactions within the Anthic platform. ## Infinite Labs URL: https://radix.wiki/ecosystem/infinite-labs Updated: 2026-02-06 Summary: Infinite Labs is building an infrastructure service that empowers communities by providing them with the tools and services to bring their projects into a multiplayer Infinite Labs is building an infrastructure service that empowers communities by providing them with the tools and services to bring their projects into a multiplayer online environment. Overview The first Radix-based infrastructure service that empowers communities by providing them with the tools and services to bring their projects into a multiplayer online environment, unlocking brand new opportunities. ## Impahla URL: https://radix.wiki/ecosystem/impahla Updated: 2026-02-06 Summary: Impahla is a smart contract-based digital asset platform, renowned for its specialization in both Primary Token Sales (including FT and NFT) via auctions and it Impahla is a smart contract-based digital asset platform (https://www.radixdlt.com/blog/runs-on-radix-q-a-impahla) , renowned for its specialization in both Primary Token Sales (including FT and NFT) via auctions and its secondary marketplace capabilities for NFTs. Besides its core functionalities, Impahla offers extended features including DAO governance tools, escrow services, and a unique lending app centered around NFTs as collateral. 
https://youtu.be/kyNuXXFD6hg (https://youtu.be/kyNuXXFD6hg) Mission In an interview with RDX Works (https://youtu.be/kyNuXXFD6hg) , Jafaroff stated that its mission is to usher in a new era of decentralized investment banking (https://www.radixdlt.com/blog/runs-on-radix-q-a-impahla) , unshackling projects from the confines of traditional finance mechanisms. This will be achieved by helping start-ups, associations, creators, and distributed ledger technology-related projects raise funds securely in return for various types of fungible and non-fungible tokens. The platform aims to provide easy access to novice investors, who often find current tools complex to navigate and thus are excluded from such innovative opportunities. Impahla addresses many fundraising and security inefficiencies present in first-generation decentralized launchpads and traditional finance fundraising platforms. It provides a single platform that caters to end-to-end lifecycle management of both fungible and non-fungible tokens. The platform's algorithms for price discovery, fair access allocation, and aftermarket allocation utilize the latest DLT to create a better experience for all participants. Platform Impahla is designed to offer next-generation features that enable founders, investors, and users to: - Raise funds. - Tokenize companies. - Invest in tokenized companies. - Govern tokenized companies. - Access a liquid secondary market for trading in and out of the portfolio of tokens created on the launchpad. - Enjoy a rich set of functionalities and services to support their investment experience. Fundraising and NFT Launches Impahla facilitates the fundraising and asset launching process for crypto-native enterprises and NFT projects, addressing the limitations of traditional fundraising, such as high fees, geographical restrictions, and excessive paperwork. 
The platform aims to eliminate these hurdles by leveraging the power of Radix DLT technology, democratizing the capital raise process, and creating synergy between projects and communities in a legally secure environment. Impahla serves as a decentralized NFT launchpad for the Radix ecosystem, offering a secure outlet for creators to launch and sell assets that may not have a market elsewhere. The platform enables creators to mint and launch various types of NFT products, such as real-life collectibles, videos, car titles, event tickets, and more. The platform also has a long-term mission of bridging decentralized finance (DeFi) with the real economy by onboarding off-chain enterprises into the crypto economy and assisting them in integrating with decentralized communities. Impahla envisions itself as a prominent fundraising platform not only for on-chain companies but also for real economy enterprises that wish to tokenize their equity or issue utility tokens and benefit from innovative decentralized governance tools. Auction Models Impahla offers multiple auction models, such as Dutch auction and second price auction, to ensure founders achieve fair valuations and targeted fundraising for their ventures and assets. Asset Types Impahla helps crypto-native enterprises and NFT creators with the issuance of various asset types, including: - Governance tokens: Allowing Crypto Native Enterprises (CNEs) to issue their governance tokens - Utility tokens: Granting rights related to products or services, such as participation in online games or decision-making within a project - Hybrid tokens: Combining governance and utility functions - NFTs: Any NFT product DAO Dynamics Impahla envisions itself as a Decentralized Autonomous Organization (DAO), a community-led entity with no central authority that is fully autonomous and transparent. Smart contracts lay the foundational rules and execute agreed-upon decisions. 
Incentive alignment and successful collaboration among all stakeholders and participants are crucial for a successful DAO. Impahla DAO aims to enable fluid collaboration and provide flexibility for high-quality talent to contribute to the Impahla ecosystem consistently. To attract and retain high-quality talent who add value to Impahla DAO, the platform has allocated 20% of its tokenomics to "DAO Incentives." These incentives will be distributed among participants based on their impact and contribution to the DAO. Community Focus Impahla aims to build a robust and successful community within the Radix ecosystem, with a shared vision and passion. The platform seeks to create a space where people with common interests can come together, learn, and grow around a shared vision. To achieve this, Impahla has allocated 15% of its tokenomics to Public sales and 15% to airdrop campaigns. Half of the airdrop allocation will go to Impahla NFT holders, while the other half will go to Radix Stakers, promoters of Impahla, active community members, and supporters. Team Impahla was conceived as a DAO, projecting a future dominated by community-led operations underpinned by code. The project's inception can be attributed to a synergistic partnership formed within the Radix ecosystem in 2021. The core team members include: - Matt Croq: A veteran developer with 12 years of experience, Croq is integral to Impahla's technical framework. - Eric: With three years in the field, Eric specializes in front-end development and earned a notable third-place finish in the Scrypto Challenge. - Jafaroff: As a product development and marketing expert with 13 years of professional experience, Jafaroff plays a pivotal role within Impahla. - Rachid: A seasoned finance professional, Rachid specializes in product strategy and business development. - Miro: Originally from the Radix community, Miro's expertise as a mobile app developer bolsters Impahla's aspirations in the mobile domain. 
Tokenomics Impahla tokenomics (https://medium.com/@impahla/impahla-tokenomics-95d879028bfe) is designed around two critical elements: DAO dynamics (incentive alignment) and the Community. Token Allocation - DAO Incentives: 20% - Public Sales: 15% - Airdrop Campaigns: 15% - DAO Reserve: 18% The DAO Reserve, comprising 18% of the token allocation, will be spent only through community voting. These reserve funds are allocated for the ecosystem's future growth, and the community will decide their usage. The platform has not yet decided on the timeline for private and public sales, as its primary focus is on building the initial version of the product. Allocation among public and private sales may change as Impahla evolves and grows with the community. Any modifications or decisions regarding private or public sales will be made through community consultation and voting. Choice of Radix Impahla's decision to build on Radix is multi-faceted: - Full Stack Approach: Radix's from-the-ground-up approach echoes Impahla's zeal for innovation. - Community Alignment: The shared ethos of fostering a genuine and engaged community binds the two platforms. - Coding Language: Radix’s Scrypto language simplifies the development process for Impahla. - Radix's Work Ethos: Radix's emphasis on technical accuracy and a product-centric approach resonates with Impahla's own philosophy. ## Ideosphere URL: https://radix.wiki/ecosystem/ideosphere Updated: 2026-02-06 Summary: Ideosphere is a crowdfunding platform specifically designed to support scientific research and open source projects. The platform operates on a monthly membersh Ideosphere is a crowdfunding platform specifically designed to support scientific research and open source projects. The platform operates on a monthly membership model, similar to services like Patreon, where supporters can contribute financially to individual researchers or research groups of their choice. 
Mission Ideosphere's mission is twofold: to provide researchers with a consistent and accessible funding stream, and to create a global community where science enthusiasts can engage directly with cutting-edge research. Platform Ideosphere's platform is designed to cater to two primary user groups: researchers and supporters. The features and functionalities are tailored to meet the needs of each group, fostering a symbiotic relationship between those conducting scientific research and those passionate about supporting it. For Researchers Ideosphere offers several key benefits for researchers seeking funding and exposure for their work: - Global Science Outreach: Researchers can engage with a worldwide network to share their research, inspiring scientific curiosity and gaining support from an international community dedicated to STEM advancements. This global platform allows for increased visibility and potential collaborations across borders. - Consistent Revenue Stream: Through the monthly membership model, researchers can secure a more stable and predictable source of funding. This consistent income can provide financial stability for ongoing research projects, allowing for better long-term planning and resource allocation. - Direct Funding Access: Ideosphere simplifies the funding process by connecting researchers directly with supporters. This approach bypasses traditional grant processes, which can be time-consuming and highly competitive. Researchers can access funds from a dedicated community passionate about advancing STEM, potentially accelerating the pace of their work. For Supporters Ideosphere provides supporters with various ways to engage with and contribute to scientific research: - Exclusive Content: Supporters gain access to special content from the researchers they fund. This may include participation in Q&A sessions, early access to research findings, and the ability to collect badges showcasing their involvement. 
This exclusive access creates a more personal connection between supporters and the research they're funding. - Likeminded Community: The platform fosters communities of people passionate about science and innovation. Supporters can join these groups to share ideas, discuss research, and connect with others who have similar interests in scientific advancement. - Monthly Memberships: Supporters can empower scientists through flexible, subscription-based memberships. These memberships can be adjusted or cancelled at any time, providing convenience and control over their level of support. - Proof of Donation NFTs: In a unique integration of blockchain technology, supporters receive Non-Fungible Tokens (NFTs) as proof of their contributions. These NFTs feature custom art and designs related to the research projects they support, serving as a digital record of their involvement in advancing scientific research. ## IdeoMaker URL: https://radix.wiki/ecosystem/ideomaker Updated: 2026-02-06 Summary: Ideomaker is a decentralized application platform that employs a no-code approach, intending to provide a multifaceted set of tools for businesses and individua Ideomaker is a decentralized application platform that employs a no-code approach, intending to provide a multifaceted set of tools for businesses and individuals. The platform leverages the advantages of the Radix Decentralised Ledger Technology (DLT) for various functionalities such as interactive planning tools, financial tools, transparency, and security of the working process, use of IoT, robots, and automated services. https://youtu.be/3Wd-mxpU60M (https://youtu.be/3Wd-mxpU60M) Services Ideomaker offers several services and features to its users: - Interactive Planning: Ideomaker offers an intuitive visual planning interface for users to visualize the production process, from ideation to execution. 
This interactive planning system helps users manage product development and distribution efficiently, with a clear overview of all stages of the process. - Global Reach: Ideomaker's geolocation features facilitate global collaboration. It uses smart matching algorithms to find potential partners based on diverse criteria such as pricing, delivery times, reputation, and quality. This ensures users can find the right support for development, distribution, or consultation, irrespective of location. - Open Ledger: Ideomaker utilizes a Decentralised Open Ledger, allowing for full transparency in product development. Users can monitor every aspect of their product's lifecycle, including manufacturing, transportation, and delivery. This ledger also helps to confirm the authenticity of products, and gives all stakeholders transparent access to a tamper-proof database. - IOT Connected: Ideomaker is designed to integrate and leverage Internet of Things (IoT) devices to enhance efficiency and reduce costs. This functionality prepares users for a future where automated tasks and services become the norm, and repetitive jobs are delegated to robots and other automated devices. - Team Projects: Ideomaker provides a global network of professionals that users can interact with to form their product development teams. The platform offers secure and reliable communication tools for effortless collaboration and synchronization of teamwork. - Automated Workflow: Ideomaker places a strong emphasis on automation. All its modules are designed to work together seamlessly, helping users improve manufacturing cycles, speed up delivery times, and reduce warehousing inefficiencies. - Marketplace & Tools: Ideomaker offers a modular framework of specialized tools. Users can choose from existing tools or create their own based on their unique needs. This framework is designed for intuitive use and seamless interaction. 
- In-App Store: Ideomaker features an in-app store for both end customers and businesses. Third-party developers can integrate their products and services into the store, which are then made interoperable across the whole platform without manual adjustments. Payment and Economics Ideomaker accepts its native token and also the native $XRD token of the Radix DLT platform that it's built on. Gradually, various cryptocurrencies will be accepted to provide users with as many opportunities for expansion and adoption as possible. In addition to digital payments, Ideomaker plans to spread the adoption of its services via Card and NFC payments and POS systems. Target Audience Ideomaker is designed for a wide range of creators, including designers, developers, freelancers, artists, small to big brands, or hobbyists. It also serves suppliers, both local and international, from small to large companies. Team Ideomaker is run by a small team of entrepreneurs with design and software development skills. The team operates from Rotterdam, Netherlands and Brasov, Romania, and is open to expanding with other skilled professionals from all over the world. The known members of the team are Vlad Butucariu, who is responsible for UI, UX, Dapp Design, and Rares Lupascu, who is involved in Software Development. ## Ice URL: https://radix.wiki/ecosystem/ice Updated: 2026-02-06 Summary: Ice is an experimental non-fungible token (NFT) project implemented on the Radix blockchain using RRC-404 - an implementation of the ERC-404 standard built with Ice is an experimental non-fungible token (NFT) project implemented on the Radix blockchain using RRC-404 - an implementation of the ERC-404 standard built with Scrypto, Radix's asset-oriented smart contract programming language. 
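The Overview below describes how ICE tokens move between fungible and NFT form via "freeze" and "melt" operations, with a combined supply capped at 1000 and a 1 XRD royalty per operation. Here is a minimal Python sketch of that supply mechanic, assuming those figures from the article; it is not the actual RRC-404 Scrypto implementation, and all names are illustrative.

```python
# Minimal sketch of the ICE freeze/melt supply mechanic described in
# this article (1000-token cap, 1 XRD royalty per operation).
# Not the actual Scrypto code; names are illustrative.

MAX_SUPPLY = 1000  # combined cap across fungible tokens and NFTs

class IceSupply:
    def __init__(self) -> None:
        self.fungible = MAX_SUPPLY  # all tokens start in fungible form
        self.nfts = 0
        self.royalties_xrd = 0      # accumulated creator royalties

    def freeze(self, n: int) -> None:
        """Convert n fungible tokens into n NFTs."""
        if n > self.fungible:
            raise ValueError("not enough fungible ICE")
        self.fungible -= n
        self.nfts += n
        self.royalties_xrd += 1     # 1 XRD per freeze/melt operation

    def melt(self, n: int) -> None:
        """Convert n NFTs back into n fungible tokens."""
        if n > self.nfts:
            raise ValueError("not enough ICE NFTs")
        self.nfts -= n
        self.fungible += n
        self.royalties_xrd += 1

    def total(self) -> int:
        # Invariant: the combined supply never changes under freeze/melt.
        return self.fungible + self.nfts
```

The key property the sketch demonstrates is that freezing and melting only move supply between the two forms; the combined total stays at the cap.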
https://youtu.be/T55DUn6QLgQ (https://youtu.be/T55DUn6QLgQ) Overview The key innovation of ICE is allowing tokens to transition between fungible and non-fungible states through "freeze" and "melt" operations. This enables combining the benefits of NFTs (unique digital artifacts) and fungible tokens (enabling fractional ownership and trading). The maximum combined supply of ICE across fungible and non-fungible tokens is capped at 1000. The proportions of different NFT rarities are dynamically maintained based on the current circulating supply of NFTs when new ones are minted or melted back into fungibles. Features - Maximum 1000 total tokens (NFTs + fungibles) - Convert between fungible and NFT via "freeze" and "melt" operations - Constant percentage of each NFT rarity maintained - 1 XRD royalty charged per freeze/melt operation Benefits - NFTs can provide liquidity to DEXs like Ociswap via their linked fungible tokens - DEX liquidity increases overall market liquidity and price discovery - Fractional ownership enabled by holding fungible token amounts - Incentivizes NFT creators via royalty stream The ICE project demonstrates new possibilities opened up by Radix's unique asset-oriented programming model, aiming to enhance NFT liquidity, ownership models, and monetization for creators. ## Hug URL: https://radix.wiki/ecosystem/hug Updated: 2026-02-06 Summary: Hug ($HUG) is a meme coin launched on the Radix network by the community. Their mission is to bring peace and happiness to everyone. Hug ($HUG) is a meme coin launched on the Radix network by the community. Their mission is to bring peace and happiness to everyone. 
https://youtu.be/AZnQ3v9BoMk?si=0V3PszQ1_Q8RGby4 (https://youtu.be/AZnQ3v9BoMk?si=0V3PszQ1_Q8RGby4) How to buy $HUG Ociswap - https://ociswap.com/ (https://ociswap.com/) Caviarnine - https://www.caviarnine.com/trade (https://www.caviarnine.com/trade) Roadmap - Token launch / DEX listings (complete) $HUG token was fairly launched on the 30th of December 2023 on Ociswap and then Caviarnine. No presale, no ICO, same price for all. - Website launch (complete) Every self-respecting meme should have its own website. Now it's done with this great place where the $hug fam can find all the info they need about the project. - Coingecko listings Because our mission is to hug the world, the world needs to find $HUG. - HUGfunding (coming soon) A decentralized protocol that allows users to contribute to an NFT fundraiser. If the fund goal is met, the funds are locked and sent to the desired recipient. If the goal is not met, funds are redistributed back to the original funders. - Hug bot (coming soon) We believe not only that everyone needs a hug, but that everyone should give a hug. Even robots. That’s why we want to program one that can be used to give hugs in an efficient and friendly way on Telegram. Beep boop. - Proof-of-Hug Our unique, and never before seen distribution method. Community members and people of the world alike will be rewarded for simply deserving a hug. It’s truly a revolution. - HalveHUGgening The amount of HUG distributed will be cut by 50% each year. Although the number of yearly hugs being distributed will decrease, it’s rumored that each one will be jam packed with more love too. Wow. Much Hug. So love. - NFT Collection While we are not ready to publicly announce any of the details about our NFT collection, one thing has been made abundantly clear. Our community demands cute and cuddly NFTs; and they hug really hard. We are scared of them. 
- Hug tales A collaborative story project about the origins and life of HUG - Hug the world ❤️ There’s a great big world out there, and our goal is to HUG and help as many people as we possibly can. ➕  Much more to come... As the project is young and driven by the community, many more features and phases are planned (marketing, listings ...) but not indicated on the roadmap. Please note that the roadmap is subject to change. ## Hermes Protocol URL: https://radix.wiki/ecosystem/hermes-protocol Updated: 2026-02-06 Summary: Hermes Protocol is a Web3 notification and communications platform developed on the Radix network, designed to facilitate connections between businesses and Web Hermes Protocol is a Web3 notification and communications platform developed on the Radix network, designed to facilitate connections between businesses and Web3 users while preserving privacy and anonymity. https://youtu.be/QCXViOacn4I (https://youtu.be/QCXViOacn4I) Overview Hermes Protocol delivers real-time notifications for critical on-ledger events, including new Governance Polls, NFT sales, Validator changes, and more. The platform allows dApps to send newsletters without requiring users to disclose their email addresses, thus promoting privacy and anonymity in the Web3 space. Hermes Protocol is compatible with popular platforms like Discord, Telegram, and Twitter, aiming to optimize user experience for both end-users and business partners. History The idea for Hermes Protocol emerged during Terra's second hackathon, where co-founders Sérgio, Ana, and Duarte combined their Web3 knowledge to create a notification platform capable of sending on-chain notifications to Discord and Telegram users. Founders Sérgio Rebelo Sérgio has experience in online communities and software automation. His past positions include roles at Aptoide, Talkdesk, and AXA. He also co-founded a gaming events company in Portugal and worked with AXA in Belgium. 
https://youtu.be/OQoDNalYf8k (https://youtu.be/OQoDNalYf8k) Ana Ana is skilled in data analysis, fencing, and logical thinking. She holds a Master's degree in Management and has worked as a risk modeler at various European banks. Ana is proficient in several programming languages and is a front-end developer. Duarte Duarte is a full-stack developer with an interest in bots and automation. He met Sérgio in the "Among Us Portugal" online gaming community and contributed to the development of a community bot that streamlined gaming sessions. Project Goals Hermes Protocol aims to reach 100,000 unique users within one year following Babylon's launch. The team plans to increase the user base by partnering with wallet providers such as Xidar, Z3us, Stream, and the official Radix Wallet. Additionally, Hermes Protocol is focused on business expansion, targeting 200 paying businesses through various marketing channels. Competitors Hermes Protocol faces competition from projects like Push (previously called EPNS), Notifi, Dialect, Mailchimp, and SendGrid. However, Hermes Protocol's business model, which does not charge end-users, may be more suitable for long-term adoption and Web3 integration. ## Gable Finance URL: https://radix.wiki/ecosystem/gable-finance Updated: 2026-02-06 Summary: Gable Finance (previously: Sundae Finance) is a decentralized liquidity market protocol being developed on Radix. It was founded by Joost Delaere in Amsterdam, Gable Finance (previously: Sundae Finance) is a decentralized liquidity market protocol being developed on Radix. It was founded by Joost Delaere (https://www.linkedin.com/in/joostdelaere123/) in Amsterdam, Netherlands in 2023. The protocol offered flash loans, allowing users to borrow funds without collateral if the borrowed funds are returned within the same transaction. This feature serves a wide variety of purposes including trading, arbitrage, and others. 
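The flash-loan rule described above can be sketched as follows: the borrow only settles if principal plus interest comes back within the same transaction, otherwise the whole operation is rejected and the pool is left untouched. This is a conceptual Python model of that atomicity, not Gable's actual Scrypto implementation; all names are illustrative.

```python
# Conceptual sketch of the uncollateralized flash-loan rule described
# in this article. Not Gable's Scrypto code; names are illustrative.

class FlashLoanError(Exception):
    pass

def flash_loan(pool: dict, amount: int, interest: int, trade) -> None:
    """Lend `amount` to `trade`, a callable that receives the borrowed
    funds and returns the repayment. Atomic: if repayment falls short
    of principal plus interest, nothing is committed and the pool
    balance is unchanged."""
    if amount > pool["xrd"]:
        raise FlashLoanError("insufficient pool liquidity")
    repayment = trade(amount)
    if repayment < amount + interest:
        # Abort before committing anything, mimicking the all-or-nothing
        # settlement of a single on-ledger transaction.
        raise FlashLoanError("loan not repaid within the transaction")
    pool["xrd"] += repayment - amount  # pool keeps the interest
```

For example, a trade that returns the principal plus 5 XRD on a 100 XRD loan at 1 XRD interest settles and grows the pool by 5 XRD, while a trade that returns only the principal is rejected outright.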
https://youtu.be/NEtdvqDkWss?si=ayftgDxrJ99TjrbF (https://youtu.be/NEtdvqDkWss?si=ayftgDxrJ99TjrbF) Etymology In a Medium post (https://gablefinance.medium.com/gable-cddeaeed139c), Joost Delaere wrote that the name ‘Gable’ was inspired by the stepped gables that are characteristic of the architecture in Amsterdam, his native city. History Gable was conceived in early 2023 by Joost Delaere, a software developer with a background in quantitative finance and credit risk model implementation. Drawing from his personal interest in cryptocurrency and decentralized finance, Delaere aimed to build a platform where traders and investors could meet and bring liquidity to the forefront of decentralized finance. The first iteration of Gable was planned for release shortly after the launch of Radix Babylon in August 2023, offering a validator node, liquidity pool, and flash loans. Features Gable allows users to participate as borrowers or suppliers. For suppliers, this involves staking $XRD at Gable's validator node, depositing Liquid Staking Units (LSUs) into the liquidity pool, earning interest, and finally, claiming resources. A unique Non-Fungible Token (NFT) serves as proof of supply, allowing the protocol to keep track of the user's entitled amount of LSUs and earnings. For borrowers, the protocol offers flash loans - a feature that enables users to access liquidity instantly at low interest rates, facilitating profitable trades that take advantage of market inefficiencies. This makes it possible for anyone to participate in DeFi markets in a more accessible and cost-effective way. Future Plans Following the launch of Gable Version 1, plans for user adoption and increased liquidity have been prioritized by utilizing social media channels and promoting Gable's validator node to $XRD holders.
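The defining property of a flash loan is that the borrow and the repayment happen atomically: if principal plus fee is not returned before the transaction ends, the whole transaction reverts and the loan effectively never happened. The following is a minimal Python sketch of that mechanic under assumed parameters (the pool size, 0.1% fee rate, and `FlashLoanPool` name are illustrative, not Gable's actual Scrypto implementation):

```python
class FlashLoanPool:
    """Toy model of a flash-loan pool (illustrative; not Gable's on-ledger code)."""

    def __init__(self, liquidity: float, fee_rate: float = 0.001):
        self.liquidity = liquidity
        self.fee_rate = fee_rate  # assumed 0.1% fee on the borrowed amount

    def flash_loan(self, amount: float, callback) -> float:
        """Lend `amount`, run `callback(amount)`, and require repayment.

        Returns the fee earned by the pool. If the callback does not return
        at least principal + fee, the loan is rolled back, mimicking the
        transaction-level atomicity the ledger provides.
        """
        if amount > self.liquidity:
            raise ValueError("insufficient pool liquidity")
        owed = amount * (1 + self.fee_rate)
        self.liquidity -= amount        # funds leave the pool...
        repaid = callback(amount)       # ...borrower uses them within the "transaction"
        if repaid < owed:
            self.liquidity += amount    # roll back: the loan never happened
            raise RuntimeError("flash loan not repaid in the same transaction")
        self.liquidity += repaid        # principal + fee returned to the pool
        return repaid - amount


pool = FlashLoanPool(liquidity=10_000.0)

# A strategy that earns just enough to repay principal plus the 0.1% fee.
fee = pool.flash_loan(1_000.0, lambda amt: amt * 1.001)
```

In a real implementation the "roll back" branch is unnecessary: the ledger aborts the whole transaction, so no manual undo is needed.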
The following year, in 2024, the protocol is planned to expand by introducing a native governance/utility token, increasing the number of lending products, supported tokens, liquidity pools, and methods for providing liquidity. The Gable team applied for a Radix Babylon Booster grant. Security The protocol is non-custodial and operates under the governance of smart contracts, offering transparency, security, and trust among all participants. Only the rightful owner can claim their portion of the pool's liquidity. ## Fidenaro URL: https://radix.wiki/ecosystem/fidenaro Updated: 2026-02-06 Summary: Fidenaro is a decentralized social trading platform developed on the Radix network that allows users to follow and copy the positions of experienced cryptocurrency traders. Fidenaro is a decentralized social trading platform developed on the Radix network that allows users to follow and copy the positions of experienced cryptocurrency traders. Etymology ‘Fidenaro’ is a neologism but could be roughly translated as ‘A place of trust’ in Latin. History Fidenaro was founded in late 2022. The full launch of the platform is planned for the first quarter of 2024, at which point traders and investors will be able to use the platform to its full extent and with real money. Founders - Andreas: (https://twitter.com/ThanosOfCrypto) Founder | Development - Erik: (https://twitter.com/erik_fdn) Founder | Marketing and Communications - Jan: (https://twitter.com/non___system) Process and Automation - Tobias: (https://twitter.com/Toby_XOXO_XRD) Founder | Marketing and Organization Features - Profit from Expertise: Users can harness the knowledge and time of expert traders. - Mutual Benefits: Fidenaro’s design ensures that both traders and investors derive benefits. While traders share their expertise to potentially boost investor profits, they, in turn, receive a portion of the gains. - Smart Leveraging: The platform offers tools that may help enhance user profits through informed leveraging decisions.
- Security: The platform provides a transparent, controlled environment bolstered with the latest technological advancements for utmost security. - Scalability: Fidenaro's underlying foundation, the Radix network, ensures that even during peak usage, transaction fees remain affordable. - Transparency: The platform bolsters trust by making traders' trading histories and performances public. This aids investors in making informed decisions. For Traders: Proficient traders have the facility to augment their profits using margin. By sharing their trading strategies and outcomes, they can attract other Fidenaro users. If their strategies are convincing and their performance commendable, they might attract capital from other users to trade with. In this partnership, traders earn a fraction of the returns made on these investments. For Investors: Investors can navigate market volatility by banking on the expertise of Fidenaro's traders. The platform assists in monitoring each trader's strategy and success, creating the necessary transparency for investment decisions. Barring the trader's fee, profits are directly transferred to the investor's wallet upon selling. $FDN Token In 2024, Fidenaro plans to launch the $FDN Token. This will give users the opportunity to invest in Fidenaro and benefit from the platform's development. Users will also be able to earn $FDN Tokens through various actions in which they support Fidenaro. Roadmap - Q4, 2023 | Beta Launch - SAPPHIRE: The open beta launch will enable users to acquaint themselves with Fidenaro's features, all while using virtual money to ensure risk-free testing. - Q2, 2024 | Fidenaro Launch - EMERALD: With the platform's official launch, both traders and investors can fully utilize Fidenaro with actual money. - 2024 | FDN Token Launch - DIAMOND: The introduction of the FDN token will present users with investment opportunities in Fidenaro. 
Additionally, users will have the chance to earn FDN tokens through various supportive actions towards Fidenaro. ## Fibonacci Finance URL: https://radix.wiki/ecosystem/fibonacci-finance Updated: 2026-02-06 Summary: Fibonacci Finance is a data-driven risk management suite designed to provide visibility on the financial risks associated with digital assets. Fibonacci Finance is a data-driven risk management suite (https://www.fibonacci.fi/) designed to provide visibility on the financial risks associated with digital assets. https://youtu.be/YCHtfgkbfQM?si=cunD7pmAA1AdKHWC (https://youtu.be/YCHtfgkbfQM?si=cunD7pmAA1AdKHWC) Overview Fibonacci Finance enables clients to develop their own applications using APIs or to set up in-house risk engines swiftly. The platform is particularly beneficial for: - Token Projects: It assists in optimizing on-chain liquidity and facilitates token listing on more lending platforms as collateral. - DeFi Protocols: Offers tools to optimize risk engine design throughout the lifecycle of a DeFi protocol. - Traders and Institutions: Incorporates advanced financial risk analysis into the due diligence process to capture opportunities and mitigate potential on-chain liabilities. The project aims to simplify the process of risk assessment for digital assets, working across various distributed ledgers. It includes a suite of tools such as liquidity monitoring dashboards, custom risk engines for perpetual exchanges, and decentralized finance (DeFi) risk engines. Services and Features Liquidity Dashboards Fibonacci Finance's liquidity dashboards are essential for any token trading on-chain, providing real-time tracking of Total Value Locked (TVL), volume, slippage, and capital efficiency. Risk Models The platform addresses the shortcomings of current DeFi risk management by providing instantaneous, real-time, data-driven approaches that simplify quantitative analysis. 
Risk Management for Different Company Stages - Seed Companies: Can kickstart their risk engine using Fibonacci's APIs and request bespoke services if needed. - Growth Companies: Manage growth and liquidity by enhancing existing engines with Fibonacci data and bespoke dashboards. - Mature Companies: Transition to data-driven risk management and automate risk processes with enhanced internal models. Real-time Incentive Optimization Fibonacci Finance offers real-time calculation of key risk parameters, basic to advanced risk metrics, and quick protocol risk evaluation, with the flexibility to customize and adjust risk parameters in real-time. The Radix Connection Fibonacci Finance has been featured in a Q&A on Radix's blog (https://www.radixdlt.com/blog/runs-on-radix-q-a-fibonacci-finance) , highlighting its specialized risk management solutions for DeFi projects, NFT ventures, and DAOs. The platform's APIs allow users to analyze financial risks associated with Radix assets, filling a gap in affordable and effective risk management tools in the DeFi sector. Co-founders Shimon and Samiar have a vision to make DeFi a robust alternative to traditional finance. They have chosen Radix for its native-layer composability, security, and DeFi-centric approach, which augments Fibonacci's functionalities and aligns with transformative DeFi projects. Roadmap and Future Plans Fibonacci Finance aims to support every burgeoning DeFi ecosystem, with plans for major chain expansion, successful fundraising, and pioneering financial risk models. Their roadmap is set to establish Fibonacci as an indispensable risk management tool in the DeFi arena. Fibonacci Finance has integrated major assets post Radix’s Babylon (https://www.notion.so/Radix-Mainnet-Babylon-2cdc168d56e04787a9f412c23560c4b0?pvs=21) update into its suite of APIs, allowing developers to access complex data regarding token composition and financial risk metrics easily. 
This integration empowers Radix DeFi projects to incorporate advanced financial risk data, aiding in tasks such as managing liquidations and visualizing slippage for users. Safety and Accessibility While the tools provided by Fibonacci Finance are designed to enhance safety by providing robust data for risk management, the implementation and effective use of these tools are up to the Radix community and other users. Founders and Team Shimon, alongside his co-founder Samyar, who has a background in international business and finance, established Fibonacci Finance. Their collective experience includes a previous DeFi project on Solana and nearly six years of collaboration on various ventures. Clientele and Grants The company has sustained itself through client acquisition and grants without the immediate need for external funding. It has received grants from several major blockchains and has garnered a diverse client base from these platforms. Future Outlook and Integration with Radix At the time of the source interview, the integration with Radix was anticipated to be already live. The team at Fibonacci Finance was closely monitoring new tokens in the Radix ecosystem, aiming to expand their risk metrics services. ## Farbocoin URL: https://radix.wiki/ecosystem/farbocoin Updated: 2026-02-06 Summary: Farbocoin ($FARBO) is a cryptocurrency developed on the Radix Network. It aims to offer a unique approach to crypto by prioritizing user-friendliness and entertainment. Farbocoin ($FARBO) is a cryptocurrency developed on the Radix Network. It aims to offer a unique approach to crypto by prioritizing user-friendliness and entertainment. https://youtu.be/-7CzzNjvA2o (https://youtu.be/-7CzzNjvA2o) Overview The project revolves around Farbo games, where users can participate and earn $FARBO tokens.
The Farbo World game has already reached a significant milestone of over 10,000 downloads and has a growing fanbase. Farbocoin also plans to offer social media airdrops, tournaments, and betting on matches as additional ways to earn tokens. By focusing on fun and engagement, Farbocoin aims to create a vibrant ecosystem where users can earn and utilize $FARBO tokens. The project recently announced a presale in collaboration with a trustee to ensure transparent handling of the token sale. Tokenomics Farbocoin is a decentralized digital currency designed to facilitate instant and secure transactions. Farbocoin's tokenomics include the following initial distribution plan: - Airdrops and Rewards: 50% (50,000,000 $FARBO) - Presale: 25% (25,000,000 $FARBO) - Team / Marketing / Development: 20% (20,000,000 $FARBO) - Advisers: 5% (5,000,000 $FARBO) Farbocoin has a maximum supply of 100,000,000 $FARBO. The team aims to develop a self-sustaining ecosystem where Farbocoin can be used as a means of payment for goods and services within the platform. It is crucial to keep in mind that tokenomics is a complex subject, and a token's value is determined by various factors such as demand and supply, adoption rate, network activity, and token utility. Product Farbocoin is a cryptocurrency that focuses on bringing fun and user-friendliness to the world of crypto. It offers an engaging experience through participation in Farbo games and tournaments, allowing users to earn $FARBO. The project aims to create an accessible platform for people to get involved in the crypto space, especially for those who may find other platforms intimidating or complex. One of the notable games associated with Farbocoin is Farbo World, which has surpassed the milestone of 10,000 downloads and has a growing fanbase. Farbo World is a colorful puzzle adventure game with a multiplayer mode that adds a twist.
In addition to playing games, users can also earn $FARBO through social media airdrops, participating in tournaments, and even betting on matches. The project aims to create a vibrant ecosystem where users can engage with Farbocoin in various entertaining ways. Farbocoin is built on the Radix Network, which provides a highly scalable and secure platform for blockchain-based applications. The Radix platform supports smart contracts and features such as atomic swaps for precise execution of transactions. Platform Farbocoin is built on the Radix Distributed Ledger, which is designed to offer a highly scalable, secure, and decentralized platform for building blockchain-based applications. Unlike traditional blockchain platforms like Bitcoin and Ethereum, which rely on a blockchain's linear structure, Radix uses a unique data structure called a "temporal graph" to enable parallel processing and near-instant transaction finality. The Radix platform also supports smart contract development, allowing developers to create and deploy complex decentralized applications (dApps). Additionally, it offers several features to address security concerns, such as atomic swaps to ensure precise execution of transactions. Farbocoin leverages the Radix platform's features to provide a user-friendly cryptocurrency experience, centered around in-game activities that offer users a fun, engaging way to earn $FARBO tokens. Security and Trust Regarding the security and trust of Farbocoin, it is important to note that the trust and security of any cryptocurrency project depend on various factors such as the underlying blockchain technology, network consensus mechanisms, and community engagement.
Farbocoin is built on the Radix Network, which is designed to provide high scalability and security for blockchain-based applications. The Radix platform uses advanced technologies to ensure the integrity and security of transactions, including elements such as Merkle trees, distributed consensus algorithms, and cryptography. These features contribute to the overall security of Farbocoin transactions and the trustworthiness of the platform. Furthermore, being developed on the trusted Radix network provides a level of assurance in terms of reliability and stability. The Radix platform has a reputation for its focus on security and has undergone rigorous testing and auditing processes to ensure the robustness of its technology. To further enhance security, it is always recommended for users of Farbocoin or any cryptocurrency to follow best practices in securing their own wallets, such as using strong passwords, enabling two-factor authentication, and keeping their private keys offline and secure. It's worth mentioning that the trust and security of any digital asset can also be influenced by how well the project is governed and the transparency of its operations. As such, it is important for users to research and evaluate the Farbocoin project and its team to assess their commitment to security and trust. Sources https://radixecosystem.com/projects/farbocoin (https://radixecosystem.com/projects/farbocoin) https://www.reddit.com/user/Farbo_Official/comments/x6s36a/farbocoin_the_journey_begins/ (https://www.reddit.com/user/Farbo_Official/comments/x6s36a/farbocoin_the_journey_begins/) https://farbo.me/Farbo_brochure.pdf (https://farbo.me/Farbo_brochure.pdf) https://farbo.me/ (https://farbo.me/) ## EtherealDAO URL: https://radix.wiki/ecosystem/etherealdao Updated: 2026-02-06 Summary: EtherealDAO is a Decentralized Autonomous Organization (DAO) that is developing the EtherealUSD stablecoin protocol on Radix.
The initiative is inspired by projects like MakerDAO and Liquity. EtherealDAO is a Decentralized Autonomous Organization (DAO) that is developing the EtherealUSD stablecoin protocol on Radix. The initiative is inspired by projects like MakerDAO and Liquity on the Ethereum network. By taking the lessons learned from these decentralized finance (DeFi) primitives, EtherealDAO seeks to innovate in the smart contracts and decentralized stablecoin space. https://twitter.com/EtherealDAO/status/1665509659824758784?s=20 (https://twitter.com/EtherealDAO/status/1665509659824758784?s=20) Overview EtherealDAO, governed by the $REAL governance token, aims to establish a CDP-type stablecoin protocol that brings stability to the Radix network ecosystem. It features unique mechanisms designed to maintain a tight peg to the U.S. dollar. The DAO uses the PoS staking rewards incentive structure native to the network to drive development and maintain the ecosystem's health. EtherealUSD The design of EtherealUSD is unconfirmed, but the Litepaper (https://ethereal.systems/EtherealUSD_Litepaper.pdf) proposes using a system called Ethereal Collateralized Debt Positions (ECDPs). An ECDP is a balance sheet of assets against liabilities: a user-owned position with market-making capabilities via a function called "Mandatory Pegging". V1 of the system, named EtherealUSD, focuses on a single-collateral, single-liability design. The system derives a Collateral Ratio (CR) from each ECDP, which is the value of assets over liabilities. If the CR goes under the Minimum Collateralization Ratio (MCR), the ECDP is liquidated. The liquidation process involves a Dutch auction to sell the assets of the ECDP, and any difference between sold and total assets goes to the Treasury. "Mandatory Pegging" is a module that allows anyone to either mint or burn the stablecoin directly against the system without creating an ECDP. This module ensures that the peg of the stablecoin is maintained in the market.
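The collateral-ratio rule described above (CR = assets over liabilities; liquidate when CR falls below the MCR) can be sketched in a few lines. This is a generic illustration, not the Litepaper's implementation; the 110% MCR used here is an assumed placeholder, not a confirmed EtherealUSD parameter:

```python
# Toy collateral-ratio check for an ECDP-style position (illustrative only).

def collateral_ratio(assets_usd: float, liabilities_usd: float) -> float:
    """CR = value of assets over liabilities."""
    if liabilities_usd == 0:
        return float("inf")  # no debt: the position cannot be undercollateralized
    return assets_usd / liabilities_usd

def should_liquidate(assets_usd: float, liabilities_usd: float,
                     mcr: float = 1.1) -> bool:
    """An ECDP is liquidated when its CR falls below the MCR (assumed 110%)."""
    return collateral_ratio(assets_usd, liabilities_usd) < mcr

# $150 of collateral backing $100 of stablecoin debt: CR = 1.5, safe.
safe = should_liquidate(150.0, 100.0)
# Collateral falls to $105: CR = 1.05 < 1.1, eligible for liquidation.
unsafe = should_liquidate(105.0, 100.0)
```

The Total Collateral Ratio (TCR) that drives the System Modes is the same calculation applied to the summed assets and liabilities of all ECDPs.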
The system operates in different "System Modes" depending on the Total Collateral Ratio (TCR), which is the total assets in all ECDPs over total liabilities. Various percentage values (RecoveryPoint, BackstopPoint, ConvexityPoint, and CriticalPoint) act as triggers for the system to change its behavior. These system modes provide mechanisms to deal with different market conditions and potential attacks. Motivations for these design choices are to address issues seen in other DeFi protocols, such as inefficiencies in capital usage, maintaining a tight peg, and handling market volatility in a smoother manner. Initial Validator Offering (IVO) One of the innovative methods employed by EtherealDAO is an Initial Validator Offering (IVO). This approach enables DAOs to crowdsource support for projects contributing to the network while advancing the ecosystem. It operates by the DAO registering a validator and calling upon community members to delegate their stake with them. The variable fee is adjusted to 100%, effectively redirecting all emission rewards to the DAO treasury for later use in incentivizing protocol participation and running operations. As a reward for their support, delegators receive the project's tokens at a later date. To participate, users need to delegate to the validator for a duration of six months. The validator operates at 100% fees during this time, with the airdrop amount dependent on the stake size. An hourly snapshot is taken to reserve tokens pro rata for everyone. Post-IVO, the validator operates at a 7% fee. The airdrop is split into two drops - one approximately two months into the IVO and the second after the IVO concludes. Tokenomics EtherealDAO operates with the $REAL Tokenomics system. Of the total token supply, 20% is allocated to the IVO airdrop and 80% is reserved for the DAO, under the control of the token voters. There is no predetermined allocation towards early contributors or the team. 
The DAO's initiators will request a 10% allocation vested over two years. All aspects of the EtherealDAO tokenomics will be decided through a DAO vote. Proceeds from the validator go directly to the DAO treasury, controlled by the token holders, less the transaction fees to deliver the airdrop. The EtherealDAO airdrop will be in the form of 20% $REAL tokens and 80% $unREAL tokens. $unREAL tokens are a 1:1 $REAL claim that unlocks after staking liquidity in the protocol. After the Babylon hard fork, only smart wallets holding the Ethereal Validator LSU will qualify for the airdrop. Once the EtherealUSD protocol is live, participants will be able to engage in both the protocol and the IVO. EtherealDAO continues to welcome contributions and support from those interested in advancing the Radix ecosystem. Further Reading - Litepaper (https://ethereal.systems/EtherealUSD_Litepaper.pdf) ## Emberflow URL: https://radix.wiki/ecosystem/emberflow Updated: 2026-02-06 Summary: Emberflow is a technology company founded by Nathan Loranca, Danny Toro, and Peter Kim. It is the creator of LifeBand, a decentralized application designed to give medical patients control over their personal health data. Emberflow is a technology company founded by Nathan Loranca, Danny Toro, and Peter Kim (https://www.notion.so/Peter-Kim-7d85a18933674230ac918f32478e925f?pvs=21). It is the creator of LifeBand, a decentralized application designed to give medical patients control over their personal health data. The company's leadership team brings together expertise from the fields of medicine, finance, and web technology, with the shared goal of revolutionizing healthcare through innovation and cutting-edge technology. Founding and Leadership Emberflow was founded by Nathan Loranca, Danny Toro, and Peter Kim, who serve as the CEO, CFO, and CTO of the company respectively. Nathan Loranca As CEO and co-founder, Nathan Loranca is responsible for guiding the overall strategic direction of Emberflow.
Prior to establishing Emberflow, Nathan worked as a medical professional, during which time he developed the idea of decentralizing the authority of medical patients over their personal data. He transitioned to the tech industry in 2020 and led the development of LifeBand. His background in both medicine and technology plays a crucial role in bridging the gap between these two sectors for Emberflow. Peter Kim Peter Kim, the CTO and co-founder, is recognized for his exceptional talent in the world of Web3 and is a three-time winner of the Scrypto challenge. His deep understanding of the Radix DLT's smart contract language Scrypto and his ability to write high-quality code have positioned him as the leader of Emberflow's tech and engineering team. Peter's expertise is integral to the integration of smart contracts and distributed technology in the healthcare system, forming the backbone of LifeBand's infrastructure. Danny Toro As CFO and co-founder, Danny Toro brings financial acumen and strategic vision to the Emberflow team. With a robust background in finance and a profound understanding of tokenomics, Danny ensures the company's financial stability and growth. His commitment to effective financial management and his ability to navigate the complex world of cryptocurrencies make him an invaluable asset to the team. Products LifeBand LifeBand (https://www.notion.so/62da10664ad641008642ebe2603ca941?pvs=21) is a decentralized application developed by Emberflow that empowers patients with control over their medical data. It aims to replace paper-based medical records and fragmented data across multiple healthcare providers with a secure, streamlined, digital alternative. Users can store and manage their medical data in real-time, making it easily accessible to healthcare providers when needed. 
This not only eliminates the need to search for old medical records but also has the potential to prevent duplication of tests, misdiagnoses, and even fatal errors that can occur when patients have medical records spread across different systems. LifeBand uses smart contract technology to manage medical data, providing a secure platform for patients to have complete control over their medical records and share them easily with healthcare providers. This leads to more informed diagnoses, reduced costs, and improved patient outcomes. Through its innovative technology, LifeBand aims to transform the healthcare system in the United States and potentially across the globe, making it more efficient and patient-centric. ## EasyMoon URL: https://radix.wiki/ecosystem/easymoon Updated: 2026-02-06 Summary: EasyMoon is a cryptocurrency project that launched via 42 weekly airdrops to engaged followers. The project is built on the Radix network and has a total supply of 42 billion $EMOON tokens. EasyMoon is a cryptocurrency project that launched via 42 weekly airdrops to engaged followers. The project is built on the Radix network and has a total supply of 42 billion $EMOON tokens. The tokens were distributed for free in a weekly giveaway that lasted for 42 weeks until the launch of Babylon (https://www.notion.so/Radix-Mainnet-Babylon-2cdc168d56e04787a9f412c23560c4b0?pvs=21). Weekly Giveaway To participate in the giveaway, users had to install the Radix Desktop wallet and stake a minimum of 1000 $XRD. Wallets that did not meet this requirement were not eligible for the giveaway. Every Monday (UTC), participants had to send 1 $XRD to the specified airdrop address. Free tokens were sent out within the next day. Tokenomics EasyMoon has a total supply of 42,000,000,000 tokens. 60% of the tokens were distributed for free in the weekly giveaway.
15% of the tokens were used for marketing and giveaways, 15% went to liquidity pools like OCISWAP and other DEXes and exchanges, and 10% went to the founders of the project for further development. $EMOON tokens can be purchased on Ociswap. Criticisms EasyMoon has been criticized for relying on hype and FOMO to drive demand for the tokens. Others have questioned the sustainability of the project, as it relies heavily on the popularity of the Radix network. Additionally, the fact that the tokens are distributed for free has led to concerns about the fairness of the giveaway and the potential for manipulation. Despite these criticisms, EasyMoon remains a popular project among cryptocurrency enthusiasts, particularly those who believe in its potential for growth. ## DogeCubeX URL: https://radix.wiki/ecosystem/dogecubex Updated: 2026-02-06 Summary: DogeCubeX is a centralized decentralized exchange (cDEX) operating on the Radix Network. It provides users with a platform to buy and sell various listed tokens. DogeCubeX is a centralized decentralized exchange (cDEX) operating on the Radix Network. It provides users with a platform to buy and sell various listed tokens directly from their Radix wallets. DogeCubeX employs an Automated Market Maker (AMM) mechanism, which offers a dynamic price based on the order size and current market conditions. The project grew out of Radix's first memecoin, DogeCube. https://youtu.be/N2bTBnymXUM (https://youtu.be/N2bTBnymXUM) Overview The functionality of DogeCubeX is user-friendly and efficient. To execute a swap, users send their tokens to the specified pool address. Within approximately five seconds, DogeCubeX sends the swapped tokens back to the user's wallet. The exchange rate is determined by a simple formula, identical to Uniswap v2, and it depends on the sent amount and available liquidity.
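The Uniswap v2 pricing mentioned above is the constant-product rule: a swap must leave the product of the two reserves unchanged (after fees), so larger orders against the same liquidity get progressively worse rates. A generic sketch, with an assumed 1% total swap fee mirroring the pool and exchange fees this page describes (the pool sizes are invented for illustration):

```python
# Constant-product (Uniswap v2-style) swap math. A generic sketch,
# not DogeCubeX's actual implementation.

def get_amount_out(amount_in: float, reserve_in: float, reserve_out: float,
                   fee: float = 0.01) -> float:
    """Output amount dy such that (x + dx')(y - dy) = x * y,
    where dx' is the input after the fee is deducted."""
    amount_in_after_fee = amount_in * (1 - fee)
    return (amount_in_after_fee * reserve_out) / (reserve_in + amount_in_after_fee)

# Hypothetical pool: 100,000 XRD against 1,000,000 of a listed token
# (spot rate 10 tokens per XRD).
xrd_reserve, token_reserve = 100_000.0, 1_000_000.0
out = get_amount_out(100.0, xrd_reserve, token_reserve)
# Slightly under 1,000 tokens: the fee and the order's price impact
# both push the realized rate below the spot rate.
```

The same formula explains the refund-on-price-change behavior: if reserves move between order placement and execution, the computed output can fall below what the user asked for, and the swap fails.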
While sending tokens, users have the option to add a message that will specify refund conditions in the event of a sudden price change. Fees and Limits Typically, the fee applied to each swap consists of three parts: - Pool Fee (0.5%): Reward for liquidity providers supplying tokens to the pool. - Exchange Fee (0.5%): Supports the operation of DogeCubeX. - Transfer Fee (0.1 $XRD): Covers the Radix Network fee for transferring tokens back to the user. In cases where a swap fails (e.g., sudden price change and insufficient sent amount for purchase), only the Transfer Fee is applied. However, if the swap fails due to user error (e.g., encrypted message, request for a token not in the pool, overly high amount), a Refund Fee of 0.5 $XRD is applied. DogeCubeX also implements order limits to prevent spamming. The minimum order is 1 $XRD and the maximum is 500 $XRD. These limits are higher for DogeCube Node stakers. Orders that fall outside these parameters are refunded, with an additional fee applied. However, larger amounts can be swapped by splitting them into multiple transactions. Supported Tokens DogeCubeX continuously adds new tokens from high-demand and reliable projects. These tokens, along with their current prices, are listed on the DogeCubeX website. How to Swap To execute a swap, users must fill out their order on the Swap page and follow the given instructions. After sending the specified XRD amount to the pool account, users receive the swapped tokens in their wallet almost instantly. Additional Cost In addition to the swap cost, a fee of 1% and a small 0.1 XRD fee (to cover Radix Network transaction costs) are charged. Community Engagement Users can ask questions or engage with the DogeCubeX team through their official Telegram group. The team is known for its reliable service and user-friendly approach to trading Radix tokens. 
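Putting the fee schedule above together, the total cost of a successful swap is the two percentage fees on the order plus the flat transfer fee. A worked example (the function name and the non-staker limit check are illustrative):

```python
# Worked example of the DogeCubeX fee schedule described above:
# 0.5% pool fee + 0.5% exchange fee + 0.1 XRD transfer fee.

POOL_FEE = 0.005      # rewards liquidity providers
EXCHANGE_FEE = 0.005  # supports DogeCubeX operations
TRANSFER_FEE = 0.1    # flat XRD fee covering the return transfer

def swap_fees(order_xrd: float) -> float:
    """Total fee, in XRD, charged on a successful swap of `order_xrd`."""
    if not (1.0 <= order_xrd <= 500.0):
        raise ValueError("order outside the 1-500 XRD limits for non-stakers")
    return order_xrd * (POOL_FEE + EXCHANGE_FEE) + TRANSFER_FEE

# A 200 XRD swap pays 200 * 1% + 0.1 = 2.1 XRD in total fees.
fees = swap_fees(200.0)
```

A failed swap from a price change pays only the 0.1 XRD transfer fee, while a user-error failure pays the 0.5 XRD refund fee instead.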
## Dogecube URL: https://radix.wiki/ecosystem/dogecube Updated: 2026-02-06 Summary: Dogecube is a decentralized exchange-traded fund (dETF) project built on Radix. Dogecube is a decentralized exchange-traded fund (dETF) project (https://www.dogecube.io/) built on Radix. It was created in 2021 to celebrate Radix and its founder Dan Hughes (https://www.notion.so/Dan-Hughes-ce73c8c28e8446b5b4e8d997f8be3d98?pvs=21), as well as the rise of "memecoin" cryptocurrencies inspired by Dogecoin. https://youtu.be/R8BSE2MzqR8 (https://youtu.be/R8BSE2MzqR8) Overview The project's native $DOGECUBE token has a fixed total supply of 8 billion, representing approximately one token for every human on Earth at the time of its launch. These tokens were initially distributed for free through a series of weekly "meme-brawl" airdrop events from September 2021 to March 2022, incentivizing creative meme content from the community. Dogecube aims to provide a transparent, secure, and community-driven platform for accessing a diverse basket of cryptocurrency projects launched on the Radix protocol. By utilizing Radix's smart contract capabilities and decentralized structure, Dogecube enables decentralized governance where token holders can vote on the composition of its investment basket. History Dogecube was inspired by the rise of memecoins like Dogecoin, which gained popularity in 2021 despite being created several years earlier as a joke. The Dogecoin community demonstrated there was a market for light-hearted, meme-based cryptocurrency projects. In this spirit, Dogecube was launched in 2021 as what its creators called a "social experiment" (https://getradix.com/updates/communityspotlight/dogecube-community-spotlight) - aiming to bring some fun and humor to the Radix ecosystem while also serving as an ambassador for the network's technology. The Dogecube name and branding play off the mythical three-headed dog Cerberus, which is also the name of Radix's novel consensus protocol.
This ties into the theme of Dogecube being a bigger, stronger "memecoin" suited for Radix. From September 2021 through March 2022, Dogecube held 25 weekly "meme-brawl" events where participants could earn free $DOGECUBE tokens by creating and sharing memes. This unique airdrop distribution model helped widely disseminate the 8 billion total token supply. After the meme-brawl airdrops, the Dogecube team continued developing the project with additional features like a decentralized exchange basket, NFT collection, and plans for a community-governed investment fund model. While starting as a lighthearted experiment, Dogecube evolved to showcase the capabilities of Radix's technology for decentralized finance (DeFi) applications. Technology Dogecube utilizes the capabilities of the Radix network and its smart contract language Scrypto to enable its decentralized applications and tokenomics model. Basket Composition The core of Dogecube is a basket (https://www.dogecube.io/#h.vlj7v5avebrs) containing up to 100 different cryptocurrency tokens launched on Radix, alongside an equivalent value of the network's native $XRD token to provide liquidity. This allows Dogecube to function as a decentralized exchange-traded fund (dETF). Swap Mechanism Dogecube uses the CALM algorithm (https://www.dogecube.io/#h.vlj7v5avebrs) developed for the DefiPlaza decentralized exchange to facilitate swaps between the $DOGECUBE token, $XRD, and the various altcoin tokens in the basket while minimizing impermanent loss. Supported swaps include: - $DOGECUBE ⇿ $XRD - $XRD ⇿ Altcoin tokens - Altcoin token ⇿ $DOGECUBE (routed through $XRD) Gravitational Fee To discourage rapid trading that could destabilize the basket, Dogecube implements a " gravitational fee (https://www.dogecube.io/#h.vlj7v5avebrs) " that adjusts dynamically based on swap frequency and size. The fee is higher for sells than buys to act as a dampening force. 
The gravitational fee calculation incorporates the swap density, time weighting, and relative swap size versus the total basket value for each token. Treasury Accumulation A portion of the fees from swaps and other sources like royalties is accumulated into treasury "chests" containing XRD reserves (https://www.dogecube.io/#h.vlj7v5avebrs). These can then fund the addition of new projects to the basket through community voting. $DOGECUBE Tokens The $DOGECUBE token is the native cryptocurrency underpinning the Dogecube project. It has a fixed maximum supply of 8 billion tokens (https://www.dogecube.io/#h.vlj7v5avebrs), which was fully issued through a series of free airdrop events (https://www.dogecube.io/) from September 2021 to March 2022. Tokenomics The 8 billion $DOGECUBE tokens were distributed for free to participants of 25 weekly "meme-brawl (https://www.dogecube.io/)" events, where creating and sharing memes earned token rewards. This distribution model aimed to achieve a wide dissemination of tokens to bootstrap the project's community. The initial supply was calculated as approximately 1 $DOGECUBE per living human on Earth (https://www.dogecube.io/#h.vlj7v5avebrs) at the time. This ties into Dogecube's lighthearted origins and goal of being an accessible "memecoin" project. Token Utilities $DOGECUBE has two core utilities within the Dogecube ecosystem: - Providing liquidity for the decentralized ETF basket when paired with XRD, the native Radix token. - Governance voting rights (https://www.dogecube.io/#h.vlj7v5avebrs) when $DOGECUBE is staked as "DOGECUBED". Each $DOGECUBED token represents 1 vote on proposals to add or remove tokens from the ETF basket. In addition to liquidity and voting rights, $DOGECUBE can also be used for fees, tipping, and any other transactions enabled by the Radix network.
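The exact gravitational fee formula from the "Gravitational Fee" section above is not published here, so the following is a purely hypothetical Python sketch of the ingredients it names: a time-weighted swap density, swap size relative to total basket value, and a higher multiplier for sells than buys. The constants and the functional form are assumptions, not DogeCube's actual algorithm.

```python
# Hypothetical illustration of a "gravitational" fee. The real DogeCube
# formula is not public in this article; per the text, the fee grows with
# recent swap density (time-weighted), grows with swap size relative to
# the basket value, and is higher for sells than for buys.
import math
import time

BASE_FEE = 0.005        # assumption: 0.5% base fee
SELL_MULTIPLIER = 2.0   # assumption: sells pay double the buy fee
DECAY_SECONDS = 3600.0  # assumption: one-hour time weighting window

def gravitational_fee(recent_swaps, swap_value, basket_value, is_sell, now=None):
    now = time.time() if now is None else now
    # Time-weighted swap density: recent swaps count more than older ones.
    density = sum(math.exp(-(now - t) / DECAY_SECONDS) for t in recent_swaps)
    relative_size = swap_value / basket_value
    fee = BASE_FEE * (1 + density) * (1 + relative_size)
    return fee * (SELL_MULTIPLIER if is_sell else 1.0)
```

The key property the text describes survives in this toy version: a burst of recent swaps or a large trade relative to the basket pushes the fee up, and selling always costs more than buying.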
Products and Services Dogecube's core product is its decentralized exchange-traded fund (https://www.dogecube.io/#h.vlj7v5avebrs) (dETF) basket containing up to 100 cryptocurrency tokens from projects built on the Radix network, alongside XRD to provide liquidity. This allows users to gain exposure to a diverse portfolio of Radix ecosystem tokens through a single investment in $DOGECUBE. The project currently offers the following products and services: Decentralized ETF Basket Users can swap between $DOGECUBE, $XRD, and the various altcoin tokens in the basket through an automated market maker (https://www.dogecube.io/#h.vlj7v5avebrs) (AMM) model utilizing the CALM algorithm. This provides liquidity for the basket tokens. AMM Liquidity Pools In addition to the unified basket pool, Dogecube previously launched DogeCubeX, a platform with separate liquidity pools for individual pairings of altcoins and $DOGECUBE. This demonstrated AMM capabilities before the unified basket. NFT Collection Dogecube has also launched an NFT collection (https://www.dogecube.io/#h.vlj7v5avebrs) called $WOW with 10,000 unique NFTs available for collectors to mint by sending 50 $XRD. Future Products Looking ahead, Dogecube plans to fully implement its DAO governance model (https://www.dogecube.io/#h.vlj7v5avebrs), where $DOGECUBE stakers can vote on proposals to add or remove tokens from the basket, effectively operating the dETF as a community-governed index. Governance Dogecube implements a decentralized autonomous organization (DAO) model where governance decisions are made through community voting by $DOGECUBE token holders. This includes votes on the composition of the project's core dETF basket. Voting System Dogecube's voting system (https://www.dogecube.io/#h.vlj7v5avebrs) is based on staking $DOGECUBE tokens as "DOGECUBED", where each $DOGECUBED token represents one vote. The maximum number of votes is capped at 10 billion.
In addition to token stakers, voting power is also granted to those running validator nodes (https://www.dogecube.io/#h.vlj7v5avebrs) for the Radix network by staking $XRD. Each 0.00000005% of staked $XRD equates to one vote, up to a maximum of 2 billion votes. Any new unstakes of DOGECUBE or $XRD during an active proposal vote are not counted. Once voting begins, the staked tokens are locked until the voting period ends. Voting Process Proposal votes (https://www.dogecube.io/#h.vlj7v5avebrs) last for 7 days total - 5 days for token holders to research and discuss the proposal, followed by a 2 day voting period. To pass, a proposal must receive a majority of "yes" votes from the combined DOGECUBED and node staking votes. If passed, the winning proposal is implemented, such as adding a new token to the dETF basket by swapping its reserve from the accumulated treasury "chest". This governance process allows the Dogecube community to collectively decide the future composition and direction of the project's investment offerings. Team and Community The Dogecube project was started by a team of individuals who had been part of the Radix community for several years. Core Team While the full team is not publicly listed, some key members mentioned include: Alfred DuLaire (Mr. PeanutButter) - Founder and driving force behind the Dogecube concept and community initiatives like the meme-brawl events. Alex - Described as the biggest Radix supporter and "community catalyst" who contributed significantly to Dogecube. Cedric - A top developer who created the "Radix Collection" NFT platform and other tools for the ecosystem. Jazzer - The developer behind the DefiPlaza decentralized exchange, whose CALM algorithm is utilized in Dogecube's swap mechanism. Peachy - An early, unwavering supporter of the Radix protocol dating back over 10 years. The team was able to coalesce organically through the growing Radix community over years of discussions and relationship building. 
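The vote-weighting rules from the Voting System section above can be sketched as follows. The function names and integer arithmetic are illustrative, not the on-ledger implementation; note that 0.00000005% per vote means a 100% share of staked $XRD maps exactly onto the 2 billion vote cap.

```python
# Sketch of the Dogecube voting weights described above; names illustrative.
DOGECUBED_VOTE_CAP = 10_000_000_000  # max DOGECUBED votes
XRD_VOTE_CAP = 2_000_000_000         # max validator-staking votes

def xrd_votes(xrd_staked: int, xrd_total_staked: int) -> int:
    # Each 0.00000005% (= 5e-10) of total staked XRD is one vote, so a
    # 100% share of stake corresponds exactly to the 2 billion vote cap.
    votes = xrd_staked * XRD_VOTE_CAP // xrd_total_staked
    return min(votes, XRD_VOTE_CAP)

def tally(dogecubed_staked: int, xrd_staked: int, xrd_total_staked: int) -> int:
    # Each staked DOGECUBED token is one vote, capped at 10 billion.
    dogecubed_votes = min(dogecubed_staked, DOGECUBED_VOTE_CAP)
    return dogecubed_votes + xrd_votes(xrd_staked, xrd_total_staked)

def passes(yes_votes: int, no_votes: int) -> bool:
    # A proposal needs a simple majority of the combined votes to pass.
    return yes_votes > no_votes
```

For example, staking 0.1% of all staked $XRD yields 2 million votes, and those votes simply add to the voter's DOGECUBED votes when a proposal is tallied.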
Community From its origins, Dogecube has emphasized building and engaging an active, participatory community. This began with the 25 weekly meme-brawl events used to fairly distribute the initial token supply through funny, creative content. The project maintains a presence across social media, chat platforms, and community calls to interact with Radix ecosystem developers, enthusiasts, and potential new users. As Dogecube transitions to its full DAO governance model, the community will play a pivotal role in voting on proposals to shape the project's direction and dETF basket composition going forward. ## DeXian Protocol URL: https://radix.wiki/ecosystem/dexian-protocol Updated: 2026-02-06 Summary: DeXian Protocol, formerly known as DeXian Staking Earning (DSE), is a liquid staking protocol based on Radix. DeXian Protocol, formerly known as DeXian Staking Earning (DSE), is a liquid staking protocol (https://docs.dexian.io/) based on Radix. The protocol offers users a unique opportunity to join a liquid staking pool and earn stable proceeds (https://thesurferinvestor.com/interview/interview-martin-ceo-of-dexian-protocol/) through the DeXian service. Notably, users can immediately redeem their $XRD investments without the typical waiting period associated with the Radix unstaking delay. DeXian: Crypto Lush (https://www.youtube.com/watch?v=mM0TN_W6P4s&t=161s) Background Developed by the technical team of KaiYuan Epoch Validator (https://dashboard.radixdlt.com/network-staking/validator_rdx1sweyknyzw9lzh6hdjeq2avh9gg5g6l7dd8he77khnv76ac8ut84hq4), DeXian has been actively involved with Radix since 2018.
The team participated in early Betanet network testing in 2021 and has been operating validators on Olympia and Alexandria. DeXian also contributed (https://docs.dexian.io/) to a series of Radix Scrypto-Challenges and played a pivotal role in promoting Radix within the Chinese community by translating key white papers, including those on Radix DeFi and the Cerberus consensus. Features DeXian Protocol operates as a platform that offers various services for users to participate in staking and lending (https://docs.dexian.io/) on the Radix network. Here are some key features and components of the DeXian Protocol platform: Liquid Staking DeXian Protocol provides a liquid staking service that allows users to stake their tokens on the Radix network while maintaining liquidity. This means that users can stake their tokens and still have the flexibility to use or trade them without waiting for the typical unstaking period. Fast Unstake The platform offers a fast unstaking feature, enabling users to redeem their staked tokens quickly. This allows users to access and use their tokens without having to wait for the standard unstaking period, which can range from 10 to 15 days on the Radix network. Stable Returns The DeXian Staking Earning (DSE) platform offers users stable returns on their staked assets. By participating in the liquid staking pool, users can earn rewards without sacrificing liquidity or waiting for the unstaking period. DeXian Lending Protocol (DLP) DeXian Protocol includes its own decentralized lending protocol called the DeXian Lending Protocol (DLP).
DLP utilizes RadixDLT's Scrypto technology and implements algorithmic lending models to determine interest rates and provide efficient and flexible lending options. This feature allows users to lend their assets and earn returns on their holdings. Delegation Services The platform offers delegation services, allowing users to delegate their tokens to validators on the Radix network. By delegating their tokens, users can actively participate in the consensus mechanism of the network and contribute to its security and operation. These features and components of the DeXian Protocol platform provide users with a comprehensive ecosystem for staking, lending, and active participation on the Radix network. Team The DeXian team (https://thesurferinvestor.com/interview/interview-martin-ceo-of-dexian-protocol/) comprises two main founders and five members, all of whom hail from the realms of computer technology and finance. The founders, Dust and Martin, bring a wealth of experience to the table. Dust, an early BTC enthusiast, has over a decade of experience in software development and has been deeply involved with projects like MakerDAO. Martin, on the other hand, holds an MSc in Finance from King's College London and has over three years of experience in the cryptocurrency and blockchain domain. Unique Selling Proposition While there are similarities between DSE and other platforms like Lido in terms of staking routing, staking rewards distribution, and liquidity enhancement, DSE stands out due to its unique application scenarios and the differences between the Radix and Ethereum network mechanisms. Future Prospects Looking ahead, once Babylon and native liquid staking go live, DSE plans to reimplement its services using Scrypto. This will allow any user to provide liquidity to the pool through the DSE protocol.
Additionally, DeXian aims to further integrate the funding pool with the Lending Protocol, thereby diversifying the DeFi ecosystem and expanding the practical application scenarios of $XRD. Security DeXian Protocol emphasizes security and trust as crucial aspects of its platform. Here are some points highlighting the security measures and trustworthiness of DeXian Protocol: Audits DeXian Protocol undergoes thorough security audits conducted by reputable and independent third-party audit firms. These audits aim to identify vulnerabilities, ensure the platform's code quality, and enhance security measures. Code Transparency DeXian Protocol promotes code transparency by making its smart contracts and codebase publicly accessible. This allows the community and security experts to review and verify the code, increasing trust and contributing to a more secure platform. Secure Infrastructure DeXian Protocol employs robust security measures and follows best practices to secure its infrastructure. This includes implementing encryption, multi-factor authentication, and regular security updates to protect user assets and data. Community Trust and Reputation DeXian Protocol values its community and strives to build trust through transparent communication, regular updates, and by being responsive to user feedback and concerns. The platform's reputation among users and the wider crypto community is an important factor in establishing trust. Decentralization The platform operates on decentralized networks like the Radix network, which is designed to ensure security through distributed consensus mechanisms. The decentralized nature reduces the risk associated with a single point of failure and increases resilience against attacks. ## DELPHIBETS URL: https://radix.wiki/ecosystem/delphibets Updated: 2026-02-06 Summary: DelphiBets is a decentralized prediction market protocol that allows users to place bets on future events. 
DelphiBets is a decentralized prediction market protocol that allows users to place bets on future events. It is the first prediction market protocol exclusively built on Radix. The platform utilizes its native token, $DPH, and provides a smart contract interface designed to reward user participation, commitment, and dedication. DELPHIBETS UI Showcase (https://youtu.be/gAanVoK37CM?si=akVCTWjLZId8hYFt) Etymology The project's name derives from the ancient city of Delphi, renowned in Greek mythology as the center of the world where the Oracle made future predictions and prophecies. History The concept of DelphiBets was founded with the idea of creating a decentralized platform where users can measure their predictive capabilities against others. Taking inspiration from the Oracle of Delphi, the team aimed to create a platform where proof of successful predictions could be provided. The project was developed in stages, beginning with the ideation and validation phase, and moving through prototyping, community engagement, and smart contract development, before concluding with the launch of the protocol alongside the Babylon upgrade of the Radix DLT. Tokenomics The platform's native token, DPH, is used for participating in predictions, staking, governance, and more. The total supply of DPH tokens is capped at 99,999,999. The distribution of tokens is as follows: - 40% for Incentives & Rewards - 24% for Airdrops & Sales - 24% for Team & Development - 8% for Marketing & Other - 4% for Advisors & Consultants As of the current state, the DPH token is listed on various exchanges including OCISWAP, DogeCubeX, CaviarSwap, RadixPlanet, Astrolescent, and DSOR. Team DelphiBets is backed by an experienced and diverse team with expertise in areas such as blockchain technology, software development, user experience design, law, and entrepreneurship.
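The DPH distribution listed under Tokenomics above can be sanity-checked with a short script. The bucket labels are illustrative names of mine, not official identifiers; the shares and total supply come from the article.

```python
# Sanity check of the DPH distribution listed above (supply 99,999,999).
# Dictionary keys are illustrative labels, not official names.
TOTAL_SUPPLY = 99_999_999
DISTRIBUTION = {
    "incentives_rewards": 0.40,
    "airdrops_sales": 0.24,
    "team_development": 0.24,
    "marketing_other": 0.08,
    "advisors_consultants": 0.04,
}
assert abs(sum(DISTRIBUTION.values()) - 1.0) < 1e-9  # shares cover 100%

# Token amounts per bucket, rounded to whole DPH.
ALLOCATIONS = {name: round(TOTAL_SUPPLY * share)
               for name, share in DISTRIBUTION.items()}
```

With the odd 99,999,999 total, the rounded buckets come out to round numbers (40 million for incentives, 24 million each for airdrops and team, and so on), which is presumably the intent of the near-100-million supply.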
Core Team Martin - Project Management: An experienced professional in Private, Corporate, and Investment Banking, Martin brings a wealth of knowledge to the DelphiBets project. As the Project Manager, Martin is responsible for overseeing the project's direction, ensuring everything aligns with its long-term goals. His understanding of traditional finance (TradFi) helps him guide the development of DelphiBets within the decentralized finance (DeFi) landscape. Jonas - Marketing & Social Media: With a background in IT-based data analysis and social sciences, Jonas applies his expertise in behavioral analysis to oversee marketing and community development. He is a longtime member of the German Radix community and is committed to enhancing the development of the Radix ecosystem. Jonas plays a crucial role in promoting DelphiBets and growing its community through strategic marketing initiatives. Felix & Chris - Scrypto Development: Both Felix and Chris are key players in the development of DelphiBets' underlying technology. Felix, an expert in machine learning and software development, found his niche in Scrypto after experiencing limitations on other Layer 1 networks. Similarly, Chris, a seasoned software developer with a degree in mathematics, brings his expertise in backend development and complex mathematical functionalities to the project. Both see enormous potential in prediction market protocols, and their work is integral to DelphiBets' functionality and success. Marc - User Experience & Interface Design: Marc, a freelance industrial designer with a portfolio of renowned 3D designs and graphic conceptions, is in charge of the frontend development and the design of DelphiBets. His aim is to create an intuitive, user-friendly interface that simplifies interactions with the platform and enhances the overall user experience.
Ivan - Frontend Development: With over 20 years of experience as a frontend developer, Ivan brings significant expertise in website creation, search engine optimization, and online marketing to the team. His knowledge of backend practices and coding experience allows him to design a smooth, efficient, and user-friendly experience. Advisors Florian Pieper: Florian is a co-founder of Ociswap and an experienced senior developer with a strong background in backend development and machine learning. As an advisor, his perspective on the development of smart contracts and backend implementations is invaluable. As a well-respected figure within the Radix community, his insights play a critical role in DelphiBets' development. Marco: A co-founder of Ociswap and serial entrepreneur, Marco has been instrumental in building up the German Radix community. His marketing vision and experience in frontend development make him an important advisor for DelphiBets, contributing to strategic decision-making and the enhancement of user experience. Legal Christiana: Christiana is responsible for all legal affairs of DelphiBets, ensuring that both the project and the DPH token comply with applicable laws. With experience in major global media companies and tech start-ups, she provides legal guidance to DelphiBets, ensuring its continued growth and adherence to legal regulations. Research Niko: Niko's role involves extensive research in information systems, with a specific focus on DLTs/DAOs and their governance mechanisms. He brings academic insights into the practical application of these technologies, helping shape DelphiBets' development. Community Moderation Marco Michelino and MoxNoxx955: Both Marco Michelino and MoxNoxx955 are integral members of the DelphiBets community. They moderate discussions, host events, and facilitate engaging interactions within the community, helping foster a supportive and active environment for DelphiBets' users.
Staking and Governance Users can stake their DPH tokens on DelphiNode to secure the Radix network and receive regular DPH airdrops, as well as higher airdrops during prediction events. Stakers can also participate in the governance of the protocol. Roadmap The project's roadmap lays out a strategic plan for its future growth, contingent on the technical progress of Radix DLT. Future plans include the integration of oracle services, incentives for participation, expansion of betting options, and the introduction of community voting mechanisms. ## DeFiPlaza URL: https://radix.wiki/ecosystem/defiplaza Updated: 2026-02-06 Summary: DefiPlaza is an open source, decentralized exchange (DEX) on Ethereum and Radix, aiming to provide efficient and sustainable decentralized finance (DeFi) solutions. DefiPlaza is an open source, decentralized exchange (DEX) on Ethereum and Radix, aiming to provide efficient and sustainable decentralized finance (DeFi) solutions. The platform emphasizes providing optimal income for its liquidity providers rather than aiming for high trade volumes. https://youtu.be/ASIu12VhCG8?si=Yy-CfuIxgd708vpV History In the wake of the expanding DeFi landscape, DefiPlaza sought to transition beyond Ethereum and explore additional blockchain platforms. With a unanimous community vote, the project revealed plans to develop a DEX on the Radix L1 chain following the Babylon upgrade. - Ethereum DEX: DefiPlaza is recognized as one of the most efficient DEXs on the Ethereum network, offering up to 120 unique trading pairs. - Radix Expansion: Leveraging the capabilities of the Radix Engine, DefiPlaza aims to design a DEX that addresses vulnerabilities commonly found in Ethereum-based smart contracts. Additionally, Radix's transaction cost dynamics are anticipated to be more favorable compared to Ethereum's. DefiPlaza's transition to Radix was influenced by the challenges faced while operating on the Ethereum network.
The initiative to shift towards Radix was driven by the platform's potential to offer a better user and developer experience, fostering a more secure environment for complex DeFi designs. Core Features - Multi-Token Trading: The platform provides direct trading between multiple token pairs, reducing the need for intermediary swaps. - Single-Sided Liquidity: DefiPlaza allows liquidity providers to be exposed to the price movements of only the token provided and DFP2, the internal operational token of the platform. - Concentrated Liquidity: To enhance capital efficiency, DefiPlaza will implement automated liquidity concentration, adjusting it based on data and market conditions. - Dynamic Fees: The platform will adjust trading fees based on market conditions, ensuring sustainability and profitability. - Token Launch Platform: An integrated platform for third-party projects to introduce and auction new tokens, which, if successful, will be immediately listed on DefiPlaza. https://youtu.be/J8V1fG66Z54?si=FpSdGGCmnRWml7at Team The project was envisioned and realized by its three co-founders: - Jazzer9F: The project's visionary, with a background in electrical engineering and a longstanding engagement with the Radix community. - UI_guy: Specializing in front-end development, he's responsible for the platform's user interface. - djtrebel: Brings startup experience and a knack for community engagement; also associated with Astrolescent, a DEX aggregator. Sustainable DeFi With a focus on long-term sustainability, DefiPlaza intends to offer token swapping services that can continually generate income. By learning from existing DeFi platforms and their shortcomings, such as UniswapV3's LP performance, DefiPlaza plans to provide more reliable and profitable solutions for liquidity providers.
Performance Metrics - Liquidity: $442K - Trade Volume: $80.3M - Fees: $102K - Trading Pairs: 124 Roadmap Past Milestones: - October 2021: DefiPlaza's launch on Ethereum. - November 2021: Integration with the 1inch aggregator. - February 2022: Website redesign. - April 2022: WalletConnect implementation. - July 2022: StablePlaza launch. - September 2022: Bridge DFP2 to Radix. - October 2023: DefiPlaza's Radix launch. ## CrumbsUp URL: https://radix.wiki/ecosystem/crumbsup Updated: 2026-02-06 Summary: CrumbsUp is an end-to-end platform designed to address the needs of Decentralized Autonomous Organizations (DAOs) built on Radix. The team also runs CrumbsNode. CrumbsUp is an end-to-end platform designed to address the needs of Decentralized Autonomous Organizations (DAOs) built on Radix. The team also runs CrumbsNode. https://youtu.be/_KVVsLpFiDw?si=asDjJPA8WqnQQjhw Introduction CrumbsUp aims to provide easy initiation and rapid scalability for DAOs without requiring technical expertise. The platform offers customizable features like proposal and voting mechanisms using a DAO's own governance token. CrumbsUp follows a Minimum Viable Product (MVP) approach with gradual enhancements to ensure quick time-to-market. The native $CRUMB token and CrumbsNode validator are also key components of the overall Crumbs ecosystem. Purpose DAOs represent a novel approach to organizational structures and governance. As decentralized organizations, DAOs operate through smart contracts on blockchain networks to distribute decision-making power amongst participants. DAOs promise to reshape organizational dynamics by promoting transparency, democracy and autonomy. However, DAOs face challenges like complex technology, lack of participation, scalability issues and legal uncertainty. They require suitable tools and infrastructure to launch and operate effectively.
CrumbsUp aims to fill this gap by providing an intuitive end-to-end solution. The platform enables easy DAO initiation and management. Users can access DAO spaces, connect wallets, submit proposals, review, vote and monitor activity. CrumbsUp automates processes like voting and coordinates decision-making across large, diverse groups. The $CRUMB token also facilitates ecosystem growth through structured distribution and utility. By tackling key DAO challenges, CrumbsUp seeks to revolutionize the DAO landscape and unlock their disruptive potential. The platform's comprehensive features focus on simplifying DAO adoption and empowering their success. Platform The CrumbsUp platform provides end-to-end functionality for DAOs to manage their operations. Users can access a customized DAO space and connect their crypto wallets to participate. DAOs can configure their governance rules, structure types, voting timelines and member permissions. Key features include: - DAO Access and Management: Admins can set up and customize their DAO's space. Users can view information like guidelines and members, and raise proposals. - Proposals, Review & Voting: Users submit proposals which admins can review and approve. Community members can view, discuss and vote on active proposals. - Intuitive Interface: The platform offers a user-friendly interface for members to participate in DAO activities. Customizable UI for a personalized experience. - Automated Processes: CrumbsUp automates manual tasks like voting, streamlining operations as DAOs scale. - Data Analytics: Dashboards and reports provide insights into DAO activities and voting results. API access enables data retrieval. Governance Types CrumbsUp allows DAOs to choose different governance models based on their needs: - Token-based: Voting power based on token holdings. Aligns influence with economic stake. - Reputation-based: Members earn reputation through contributions. Rewards active participation.
- Liquid Democracy: Delegate votes to others with more expertise on specific topics. - Futarchy: Use prediction markets to choose proposals with highest expected value. - Quadratic Voting: Allocate votes non-linearly to better reflect preference intensities. DAO governance has various approaches and CrumbsUp aims to provide options for customization aligned with a DAO's values and community. Service Offering CrumbsUp utilizes a token-based model with the native $CRUMB token on the Radix DLT. Key aspects include: - Free Platform Access: Users can access CrumbsUp without fees, only network transaction costs apply. - DAO Onboarding: Free to onboard DAOs, but minimum $CRUMB holdings required to create a DAO. - Proposal Development: Free to create ideas, but $CRUMB "fuel" needed to release proposals. Aligns with $XRD price. - Community Voting: Free for communities to vote on proposals. Encourages participation. - Service Upgrades: DAOs can upgrade features for a defined $CRUMB fee, catering to diverse needs. $CRUMB facilitates ecosystem growth by aligning incentives for usage without imposing excessive fees. Tokenomics The $CRUMB token has a fixed supply of 1 billion. Distribution: - 52% for liquidity and ecosystem development. - 10% for initial DEX listing. - 10% distributed over 24 months to CrumbsNode stakers. - 8% allocated to CrumbsUp founders, locked for 2 years. - 20% burned. Public sale enables broad access. Locked token portions ensure incentives aligned for growth. Distribution strategy fosters community ownership and engagement. Team CrumbsUp was founded by a team of 5 DLT enthusiasts based in Germany: - Vinny - Leads NFT creation, designs, networking and steering for CrumbsUp. Organized first German Radix meetup. - Nils - Key role in strategic decisions and utility development. Meetup co-organizer. - Timo - Drives marketing initiatives to increase visibility and community engagement. 
- Martin - Responsible for frontend development of the CrumbsUp platform and solutions. - Stephan - Leads backend development of the core CrumbsUp platform and components. The team aims to bring value to the Radix ecosystem, having been involved as investors and community members. Their ambition evolved into building CrumbsUp to address key DAO needs. Vinny and Nils provide leadership, strategic direction and business development. Timo, Martin and Stephan contribute specialized skills in marketing, design and engineering. Together, the team combines a strong understanding of blockchain and passion for empowering the DAO space. ## CrumbsNode URL: https://radix.wiki/ecosystem/crumbsnode Updated: 2026-02-06 Summary: CrumbsNode is a node that is an integral part of the Crumbs ecosystem. It is a platform developed by the CrumbsDAO that enables other projects to bring their DAOs to life. CrumbsNode is a node that is an integral part of the Crumbs ecosystem. It is a platform developed by the CrumbsDAO that enables other projects to bring their DAOs to life. CrumbsNode runs on a 24/7 monitored cluster with automatic backups, and it provides a safe and secure environment for executing consensus and block generation on the Crumbs network. Mission The mission of CrumbsNode is to provide a robust and reliable infrastructure for running decentralized autonomous organizations (DAOs) within the Crumbs ecosystem. CrumbsNode aims to empower projects with the tools and services needed to create and operate their own DAOs in a secure and efficient manner. CrumbsNode's mission aligns with the broader goals of the Crumbs ecosystem, which is to foster decentralization, transparency, and innovation in the blockchain space. By offering a stable and secure node infrastructure, CrumbsNode enables projects to build and operate DAOs that can execute consensus protocols, process transactions, and facilitate decentralized decision-making.
Additionally, CrumbsNode strives to provide a seamless and user-friendly experience for projects utilizing the infrastructure. It aims to simplify the technical complexities involved in running a DAO, allowing projects to focus on their core objectives and drive their desired outcomes within the Crumbs ecosystem. Products CrumbsNode is a product and service offered by the Crumbs ecosystem. It is designed to provide a reliable and secure node infrastructure for projects that want to create and operate their own decentralized autonomous organizations (DAOs). The main features and services provided by CrumbsNode include: Node Infrastructure CrumbsNode offers a 24/7 monitored cluster with automatic backups, ensuring the availability and reliability of the node infrastructure. Consensus and Block Generation CrumbsNode enables the execution of consensus protocols and block generation on the Crumbs network, facilitating the decentralized processing and confirmation of transactions. Secure Environment CrumbsNode provides a safe and secure environment for running DAOs, ensuring the integrity and protection of the underlying blockchain network. Integration with Crumbs Ecosystem Being part of the Crumbs ecosystem, CrumbsNode allows seamless integration with other components and services offered by Crumbs, providing a complete solution for building and operating DAOs. Networks CrumbsNode is a multi-chain node infrastructure that supports several leading blockchain networks. Here are some of the blockchain networks currently supported by CrumbsNode: Ethereum CrumbsNode supports the Ethereum network, providing node infrastructure for running Ethereum-based smart contracts and decentralized applications (dApps). Binance Smart Chain (BSC) CrumbsNode also supports the Binance Smart Chain network, offering reliable and efficient node infrastructure for running BSC-based dApps, trading, and other services. 
Polygon (formerly Matic) CrumbsNode provides infrastructure support for the Polygon network, which offers fast and low-cost transactions on Ethereum-compatible sidechains. Avalanche (AVAX) CrumbsNode also supports the Avalanche network, which features high throughput and low latency for fast and efficient transaction processing. Polkadot CrumbsNode provides support for the Polkadot network, which utilizes a unique sharded multichain architecture and cross-chain interoperability. CrumbsNode's multi-chain approach allows projects and dApps to leverage the advantages of different blockchain networks, selecting the most suitable network for their specific needs and requirements. By using CrumbsNode, projects can benefit from reliable and efficient node infrastructure, while also enjoying the flexibility of operating across different blockchain networks. Tokenomics CrumbsNode has its own native token, $CRUMBS, which serves as the official governance token of the Crumbs ecosystem and the CrumbsUp platform. By holding $CRUMBS, node operators can benefit from various incentives, such as monthly airdrops, zero fees, and more. Staking Process The staking process on CrumbsNode involves the following steps: Obtain CRUMBS Tokens To participate in the staking process, you need to acquire CRUMBS tokens. These tokens can be obtained through various means, such as purchasing them from exchanges where CRUMBS is listed or participating in token sales or liquidity mining programs. Set up a CrumbsNode Once you have CRUMBS tokens, you need to set up a CrumbsNode to stake your tokens. This involves configuring and running the necessary software on a suitable hardware setup, complying with the system requirements specified by CrumbsNode. Stake CRUMBS Tokens After setting up your CrumbsNode, you will need to stake your CRUMBS tokens by locking them in a dedicated staking contract or mechanism supported by CrumbsNode. This locking of tokens serves as collateral and helps secure the network. 
Participate in Consensus and Block Generation As a staked node operator, you will be eligible to participate in the consensus and block generation process on the CrumbsNode network. Your staked tokens give you a chance to be selected as a block producer and earn rewards for generating blocks. Earn Rewards By successfully generating blocks, you will earn CRUMBS token rewards. The rewards are distributed automatically by the network according to the consensus protocols and reward mechanisms in place. Maintain and Monitor Your Node It is essential to ensure that your CrumbsNode is properly maintained and kept up to date with the latest software versions and security patches. Monitoring your node's performance and health is also important to ensure its reliable operation. Security and Trust CrumbsNode prioritizes security and trust as crucial aspects of its infrastructure and operations. Here are some key points about the security and trust measures implemented by CrumbsNode: Robust Infrastructure CrumbsNode provides a reliable and resilient node infrastructure backed by 24/7 monitoring and automatic backups. This helps ensure the availability and stability of the network, mitigating the risk of downtime or disruptions. Secure Environment CrumbsNode maintains a secure environment for running DAOs, implementing industry-standard security measures to protect against external threats. This includes measures like encryption, secure data storage, and secure network protocols. Consensus Protocols CrumbsNode uses consensus protocols that are designed to provide strong security guarantees for transaction processing and block generation. The protocols ensure the integrity and immutability of the blockchain, making it resistant to tampering or unauthorized modifications. Reputation System CrumbsNode employs a reputation system to evaluate and rank node operators based on their performance, reliability, and adherence to the network's rules. 
This helps ensure that trustworthy and competent node operators are incentivized and selected for participation in the network. Smart Contract Audits The smart contracts and protocols used in the Crumbs ecosystem, including those governing the operation of CrumbsNode, are subjected to rigorous audits by reputable third-party security firms. These audits help identify vulnerabilities or weaknesses and ensure that the system is resilient against potential security risks. The Crumbs ecosystem involves a decentralized governance structure, wherein token holders can actively contribute to decision-making processes and participate in shaping the network's security and trust measures. This helps foster transparency, accountability, and community involvement in maintaining the security of CrumbsNode. ## Cobra stakes URL: https://radix.wiki/ecosystem/cobra-stakes Updated: 2026-02-06 Summary: Cobra Stakes is a validator in the Radix ecosystem that provides Proof-of-Stake infrastructure services. They offer staking services for participants who want to stake their coins and earn rewards on the Radix network. Cobra Stakes is a validator in the Radix ecosystem that provides Proof-of-Stake infrastructure services. They offer staking services for participants who want to stake their coins and earn rewards on the Radix network. By delegating their stake to Cobra Stakes, users can participate in the consensus process and help secure the network while earning staking rewards in return. Cobra Stakes is known for its trusted and reliable infrastructure, making it a popular choice for participants looking to stake their coins and earn rewards on the Radix network. Products and services Cobra Stakes Validator is a staking service provider that specializes in Proof-of-Stake (PoS) infrastructure services. They offer services to users who want to participate in the consensus process of the Radix network by staking their Radix coins.
Some of the products and services offered by Cobra Stakes Validator include: Staking Cobra Stakes Validator allows users to delegate their Radix coins to their validator to participate in the consensus process of the Radix network. Users can earn rewards for participating in this process. Validator node They operate validator nodes that are essential for the consensus process of the Radix network. The nodes are known for their reliability and security. Tools and resources Cobra Stakes Validator provides users with various tools and resources to help them manage and monitor their staked coins. They also offer support to users who need assistance with their staking process. Overall, Cobra Stakes Validator is a reliable stakeholder in the Radix network, providing participants with a secure and trusted staking service to help them earn rewards on their staked coins. Networks Cobra Stakes Validator primarily operates within the Radix ecosystem, providing staking services for Radix (XRD) coins. They focus on offering Proof-of-Stake (PoS) infrastructure solutions specifically for the Radix network. Staking To stake with Cobra Stakes Validator, you would typically follow these steps: Obtain the relevant cryptocurrency Cobra Stakes Validator primarily operates within the Radix ecosystem, so you would need to acquire Radix (XRD) coins to stake with them. Ensure that you have the required amount of XRD tokens to participate in staking. Set up a wallet You need a compatible wallet to hold your XRD tokens. Ensure that you choose a wallet that supports Radix (XRD) and provides staking functionality. Popular options include the Radix Desktop Wallet or any other Radix-compatible wallet. Delegate your tokens Once you have acquired and secured your XRD tokens in the wallet, you can delegate or stake them with Cobra Stakes Validator. This process typically involves accessing the staking section of your wallet and selecting Cobra Stakes Validator as your preferred validator. 
Confirm the delegation Follow the instructions provided by your wallet to confirm the delegation and lock your tokens for staking. This will involve signing a transaction using your wallet's interface. Earn staking rewards By staking your tokens with Cobra Stakes Validator, you become eligible to earn staking rewards. These rewards are typically distributed based on the amount of tokens you have staked and the duration of your stake. Security and trust Cobra Stakes Validator prioritizes security in their staking services. They take measures to ensure the safety and protection of the assets entrusted to them. Here are some of the security practices typically adopted by staking service providers like Cobra Stakes Validator: Validator Node Security Cobra Stakes Validator runs a secure and reliable validator node that participates in the consensus process of the network. This node is configured and managed in a way that reduces vulnerabilities and prevents unauthorized access. Infrastructure Security They employ robust security protocols to safeguard their infrastructure, including firewalls, encryption, and secure data storage practices. Network Monitoring Cobra Stakes Validator constantly monitors the network for any potential threats or anomalies. They take proactive measures to maintain the security of their systems and promptly respond to any security incidents. Secure Communication To protect user data and transactions, they utilize secure communication channels such as encrypted connections and protocols. ## Clarity Protocol URL: https://radix.wiki/ecosystem/clarity-protocol Updated: 2026-02-06 Summary: The Clarity Protocol is an open-source platform that enables decentralized autonomous organizations (DAOs) to manage on-chain governance. Clarity provides a user-friendly interface and tools for DAOs built on Cardano and Radix to create and vote on governance proposals without needing to write any code. The Clarity Protocol is an open-source platform that enables decentralized autonomous organizations (DAOs) to manage on-chain governance.
Clarity provides a user-friendly interface and tools for DAOs built on Cardano and Radix to create and vote on governance proposals without needing to write any code. https://youtu.be/AaN50YIB8co?si=GXMX5k-adR8ylRCQ (https://youtu.be/AaN50YIB8co?si=GXMX5k-adR8ylRCQ) History In June 2022, Clarity launched on the Cardano testnet, allowing users to start experimenting with creating and managing DAOs. After extensive testing and refinement, Clarity launched on the Cardano mainnet in October 2022 and on Radix in October 2023 (https://x.com/clarity_dao/status/1716539995220025450) . Key Concepts - Decentralized autonomous organizations (DAOs) - Groups that leverage blockchain technology to enable decentralized governance and decision-making. Clarity enables DAOs to manage on-chain governance. - Agora protocol - An audited library of open source Plutus scripts used to create and govern DAOs on Cardano. Clarity extends the functionality of Agora. - On-chain governance - Governance model where rules and decision-making are encoded on a blockchain. Clarity enables on-chain governance for DAOs. - Stakes - In Clarity, stakes are used to lock a DAO member's governance tokens to calculate voting power. - Proposals - On Clarity, DAO members can create proposals to hold votes on decisions. Proposals contain on-chain effects that execute if passed. - Voting - DAO members can vote on proposals using their stakes. Votes are recorded on-chain. Organizations vs Agoras Clarity distinguishes between Organizations and Agora DAOs. Organizations Organizations are groups that can leverage blockchain technology but rely on manual execution of governance decisions by trusted members. Pros of Organizations are that governance actions are free since they don't require on-chain transactions. They also allow for more control by restricting permissions. Cons are that they rely on trusted actors to manually enforce decisions. 
Agoras Agora DAOs are a subset of organizations on Clarity that rely on the Agora protocol for fully on-chain and automated governance. Agora DAOs use smart contracts to encode governance rules and autonomously execute decisions. Pros of Agora DAOs are trustless execution, transparency through on-chain record keeping, and permissionless participation based on token holdings. Cons are that governance actions require paying transaction fees. In Agora DAOs, any member can create proposals if they meet the minimum token threshold. Proposals contain on-chain effects that are executed if the proposal is approved through a vote. Votes on proposals are recorded on the blockchain. The Agora protocol supplies the on-chain governance infrastructure used by Agora DAOs on Clarity. Agora provides core components like governor contracts, stakes, and proposals. Clarity builds on top of Agora to make decentralized governance accessible for everyday users through its interface. Functionality Clarity offers various functionality to enable on-chain governance for DAOs without needing blockchain development skills: Creating and managing DAOs Clarity provides tools to create new decentralized autonomous organizations that leverage blockchain technology for governance and coordination. Participating in governance Clarity allows members of a DAO to take part in governance activities like submitting proposals, voting, staking tokens, and configuring parameters. Configuring governance parameters DAO admins can configure governance settings like token thresholds, timelocks, and voting cadences through Clarity's interface. Voting There are two main types of voting available in Clarity - polls and proposals. Polls are off-chain votes used to gauge sentiment from an organization's members. They do not execute any on-chain effects. Creating and voting in polls may be restricted to admin users depending on the organization's settings. 
Proposals involve on-chain voting that is recorded on the blockchain. Proposals are only available to Agora DAOs on Clarity. Any member who meets the minimum token threshold can create a proposal. Proposals contain smart contract effects that will automatically execute if the proposal is approved through the vote. Votes on proposals are submitted via transactions on the blockchain. There are key differences between the voting power used for polls versus proposals: - For polls, simply holding the governance token grants voting power based on the token balance. - For proposals, members must lock their governance tokens into stakes to gain voting power. The voting power is based on the amount of tokens locked in the stake. Proposals must go through defined stages including drafting, voting, execution, and completion based on the parameters in the DAO's governance contract. Proposal voting is permissionless for any member with enough stake. Bounties Bounties allow crowdsourcing submissions and conducting votes to synthesize collective intelligence. Team (https://clarity.community/team) - Logan Panchot: Handles community growth and fundraising. Graduated from Stanford University and has roles in business operations and community growth at Clarity. - Justin Schreiner: Leads frontend development. Has a background in Computer Science and is a Plutus Pioneer. - Matt Laux: Heads full-stack development. An engineering graduate from Texas A&M. - Ben Hart: Technical development leader with a decade of experience in the software industry. - Mark Florrison: Leads engineering operations. Founded MLabs, a Cardano and blockchain consulting company. - Chris Borders: Compliance lead and general counsel with over three decades of legal experience. - Tomasz Maciosowski, Nigel Farelly, Michał Adamczyk: Plutus Developers with a focus on Haskell and functional programming. Advisors - Mateen Motavaf: Former CEO of SundaeSwap, involved in strategic decision-making for Clarity. 
- Shannon Wu: Investor in the blockchain sector, advises Clarity on strategy and fundraising. - Paul Levine: Has 30 years of experience in the payments industry. - Izzat-Begum B. Rajan: Legal and tax expert with extensive experience in tax-efficient legal structuring. ## Caper URL: https://radix.wiki/ecosystem/caper Updated: 2026-02-06 Summary: Caper is a Web3 bourse and launchpad platform designed to accelerate the development of Decentralized Autonomous Organizations (DAOs) from creation and funding through to governance, exit, and acquisition. Caper is a Web3 bourse and launchpad platform designed to accelerate the development of  Decentralized Autonomous Organizations (https://en.wikipedia.org/wiki/Decentralized_autonomous_organization)  (DAOs) from creation and funding through to governance, exit, and acquisition. Summary Caper’s core innovation centers on ‘capers’ – continuous organizations described as "evergreen utility machines" that are designed to reward productive endeavor while eliminating inefficiencies commonly associated with traditional DAO structures. Caper specifically addresses two persistent challenges in decentralized governance: the  free-rider problem (https://en.wikipedia.org/wiki/Free-rider_problem) , where participants benefit from collective efforts without contributing, and voter fatigue, where community members become disengaged from governance processes over time. Each DAO created on the Caper platform operates through a  bonding curve (https://en.wikipedia.org/wiki/Bonding_curve)  mechanism that serves as an automated market maker, providing immediate liquidity for governance tokens and establishing deterministic pricing without traditional market-making intermediaries. The platform employs a unique governance system that calculates voting power and treasury exit rights based on both token holdings and historical voting participation, creating economic incentives for sustained community engagement.
The project represents an attempt to synthesize concepts from  cooperative economics (https://en.wikipedia.org/wiki/Cooperative) ,  anarcho-syndicalism (https://en.wikipedia.org/wiki/Anarcho-syndicalism) , and  Austrian school economics (https://en.wikipedia.org/wiki/Austrian_school)  within a blockchain-based organizational framework, positioning itself as a solution to what its developers characterize as the limitations of both centralized governance systems and existing DAO implementations. Background and Development Motivation and Problem Statement Caper's development was motivated by perceived failures in both centralized governance systems and existing DAO implementations. The platform's creators cite historical examples of governmental overreach and economic mismanagement as justification for decentralized alternatives, noting that  communist regimes alone were responsible for 65 million deaths (https://en.wikipedia.org/wiki/Mass_killings_under_communist_regimes)  in the hundred years following 1917, according to estimates by historian  Stephen Kotkin (https://en.wikipedia.org/wiki/Stephen_Kotkin) . The whitepaper characterizes modern governments as "haphazard at best, determined enemies of human flourishing at worst," positioning DAOs as offering "the hope of rapid, low-risk governance experimentation on willing participants." The project also addresses monetary policy concerns, particularly the  debasement of fiat currencies (https://fred.stlouisfed.org/graph/?g=1jThZ)  since the abandonment of the gold standard. According to the platform's documentation, the US Dollar has lost over 99% of its purchasing power since 1971 due to unrestrained money printing, creating price distortions that benefit those who understand the monetary system while disadvantaging others. This analysis aligns with  Austrian school economic theory (https://en.wikipedia.org/wiki/Austrian_school) , which emphasizes the importance of sound money and free markets. 
Regarding existing DAO implementations, Caper's developers argue that the promise of decentralized governance has not been realized largely due to technical limitations of underlying blockchain infrastructure. They identify several persistent problems in the DAO ecosystem, including the  free-rider problem (https://en.wikipedia.org/wiki/Free-rider_problem)  where participants benefit from collective efforts without contributing,  voter fatigue (https://en.wikipedia.org/wiki/Voter_turnout)  leading to low governance participation, and vulnerability to governance attacks such as those experienced by protocols like  Rook DAO (https://rook.fi/) . Philosophical Foundation The platform represents an attempt to synthesize ideas from  cooperatives (https://en.wikipedia.org/wiki/Cooperative) ,  anarcho-syndicalism (https://en.wikipedia.org/wiki/Anarcho-syndicalism) , and  Austrian economics (https://en.wikipedia.org/wiki/Austrian_school) , with the stated goal of reconciling what the developers characterize as "fruitless dichotomies" between left-wing and right-wing politics, as well as between capitalism and collectivism. This approach reflects broader trends in the  Web3 (https://en.wikipedia.org/wiki/Web3)  movement toward creating alternative economic and governance structures that operate outside traditional nation-state frameworks. Platform Features Bonding Curve Mechanism Caper's core technological innovation centers on its implementation of  bonding curves (https://en.wikipedia.org/wiki/Bonding_curve)  as automated market makers (https://en.wikipedia.org/wiki/Automated_market_maker) (AMMs) for DAO governance tokens. Each caper operates through a mathematical function that determines token pricing based on circulating supply. This mechanism provides several key advantages over traditional liquidity pools: permissionless operation, deterministic pricing with zero slippage, and guaranteed liquidity at any price point. 
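The bonding-curve mechanics described above can be sketched in miniature. Caper's actual curve function is not published, so this illustration assumes a simple linear curve p(s) = k * s, and the class and method names below are hypothetical. It shows the two properties the text emphasizes: deterministic pricing (the cost of any purchase is the integral of the curve over the minted range) and guaranteed liquidity (all collateral stays in the curve's reserve, so a full sell-back is always covered).

```python
# Illustrative sketch only: assumes a linear bonding curve p(s) = k * s.
class BondingCurve:
    """Minimal bonding-curve AMM: mints tokens on buy, burns on sell,
    holding all collateral in an internal reserve."""

    def __init__(self, k: float):
        self.k = k            # slope of the linear price curve p(s) = k * s
        self.supply = 0.0     # circulating tokens minted by the curve
        self.reserve = 0.0    # collateral (e.g. XRD) held by the curve

    def buy_cost(self, amount: float) -> float:
        # Deterministic price: integral of p(s) from supply to supply + amount.
        s = self.supply
        return self.k / 2 * ((s + amount) ** 2 - s ** 2)

    def buy(self, amount: float) -> float:
        cost = self.buy_cost(amount)
        self.supply += amount
        self.reserve += cost
        return cost

    def sell(self, amount: float) -> float:
        # The reserve always equals k/2 * supply^2, so every sell-back is
        # fully collateralized: liquidity is guaranteed at any price point.
        s = self.supply
        payout = self.k / 2 * (s ** 2 - (s - amount) ** 2)
        self.supply -= amount
        self.reserve -= payout
        return payout


curve = BondingCurve(k=2.0)
curve.buy(10.0)          # costs 1.0 * 10^2 = 100.0 collateral
print(curve.sell(10.0))  # pays back exactly 100.0, leaving an empty reserve
```

Because cost and payout are both exact integrals of the same curve, there is no slippage between the quoted and executed price, which is the property the whitepaper contrasts with pool-based AMMs.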
The bonding curve system eliminates common risks associated with externally supplied liquidity, including  rug pulls (https://en.wikipedia.org/wiki/Exit_scam) and price manipulation that have plagued other  decentralized finance (https://en.wikipedia.org/wiki/Decentralized_finance)  protocols. By holding collateral directly within the curve contract, the system removes dependence on external market makers or liquidity incentives. The mathematical structure inherently incentivizes early supporters while providing superior price stability compared to traditional AMM implementations. Cashtag System The platform employs a unique identification system called "cashtags" - alphanumeric identifiers that prevent spoofing and establish clear project identity. The cost to mint a caper is calculated as 10^(6-x) XRD, where x represents the number of characters in the desired cashtag. This pricing structure means that shorter, more valuable cashtags require significantly higher initial investment, with a three-character cashtag like "$CPR" costing 1,000 XRD tokens. The scarcity imposed by unique cashtag requirements, combined with the perpetual liquidity guarantee provided by bonding curves, creates economic incentives for valuable cashtags to be "recolonized" rather than abandoned if founding teams depart. This mechanism provides community members with potential paths back to profitability even in cases of project abandonment. Fixed Token Supply Architecture All capers implement a standardized token supply of 100 billion governance tokens, which enables direct price comparisons between projects and prevents attempts to manipulate apparent market capitalizations through supply adjustments. This fixed supply model provides transparency in ownership percentage calculations and eliminates the  Cantillon Effect (https://en.wikipedia.org/wiki/Richard_Cantillon)  associated with unbacked token issuance.
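The cashtag pricing formula 10^(6-x) XRD can be expressed directly. The helper below is hypothetical (the name and the leading-"$" handling are assumptions), and it assumes the formula applies to tags of one to six characters, since longer tags would produce fractional costs:

```python
def cashtag_mint_cost_xrd(cashtag: str) -> int:
    """Cost in XRD to mint a caper, per the 10^(6 - x) formula,
    where x is the number of characters in the cashtag.
    Hypothetical helper; assumes tags of 1-6 characters."""
    x = len(cashtag.lstrip("$"))  # "$CPR" -> 3 characters
    if not 1 <= x <= 6:
        raise ValueError("assumed cashtag length of 1-6 characters")
    return 10 ** (6 - x)


print(cashtag_mint_cost_xrd("$CPR"))  # 1000, matching the example in the text
```

Each character shaved off a cashtag multiplies the mint cost by ten, which is the mechanism behind "shorter, more valuable cashtags require significantly higher initial investment."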
Governance System Vote Weight Calculation Caper implements a sophisticated governance mechanism that calculates voting power and treasury exit rights. This mathematical approach addresses persistent challenges in  DAO governance (https://en.wikipedia.org/wiki/Decentralized_autonomous_organization) , including voter apathy and free-rider problems. The function creates natural threshold mechanisms where meaningful voting power and exit rights accrue only to members demonstrating both economic commitment through token ownership and active engagement through consistent participation in governance processes. The vote weight system provides protection against governance attacks, as temporary token acquisition without corresponding voting history cannot immediately grant disproportionate influence. This prevents the type of governance sniping that has affected protocols like Rook DAO, where attackers acquire large token positions specifically to manipulate governance outcomes. Ordinal Ranking Voting The platform implements  ordinal voting (https://en.wikipedia.org/wiki/Ranked_voting)  systems for all governance decisions, where voters rank options by preference rather than selecting single choices. In a ballot with multiple options, the highest-ranked choice receives points equal to (number of options - 1) multiplied by the voter's vote weight, with subsequent choices receiving proportionally fewer points. This method elicits more comprehensive information about voter preferences compared to simple majority systems while avoiding the complexity of  pairwise comparison (https://en.wikipedia.org/wiki/Pairwise_comparison)  methods. Supermajority Requirements Governance decisions require supermajority approval based on the formula: threshold = (1/x) × 1.5, where x represents the number of ballot options. This approach means binary votes require 75% approval, three-option votes require 50% support for the winning option, and four-option votes require 37.5% support. 
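Both formulas above are simple enough to sketch. The text specifies only that lower-ranked choices receive "proportionally fewer points," so the linear (Borda-style) decrement below is an assumption, as are the function names:

```python
def supermajority_threshold(n_options: int) -> float:
    """Approval threshold = (1/x) * 1.5 for a ballot with x options."""
    return 1.5 / n_options


def ordinal_points(ranking: list, vote_weight: float) -> dict:
    """Borda-style scoring (assumed): with n options, a voter's top choice
    earns (n - 1) * weight points, the next (n - 2) * weight, down to zero."""
    n = len(ranking)
    return {option: (n - 1 - i) * vote_weight
            for i, option in enumerate(ranking)}


print(supermajority_threshold(2))            # 0.75 -> binary votes need 75%
print(ordinal_points(["A", "B", "C"], 2.0))  # {'A': 4.0, 'B': 2.0, 'C': 0.0}
```

Note how the threshold falls as options are added (75%, 50%, 37.5%, ...), so splitting a decision across more options lowers the bar any single option must clear.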
The supermajority requirement reflects the principle that proposals should not pass when they represent maximum controversy, ensuring broad community support before implementation. Legislative and Executive Proposals The governance system distinguishes between two types of proposals: legislative proposals that serve as non-binding signals of community intention, and executive proposals that trigger specific blockchain transactions upon approval. Executive proposals include a mandatory execution delay following vote completion, allowing members who disagree with outcomes to exercise exit rights before implementation. This mechanism ensures member autonomy while maintaining community decision-making authority. Proposal creation and vote submission both require fees to deter spam and fund platform operations. Combined with the vote weight incentive system, this fee structure eliminates the need for traditional  quorum (https://en.wikipedia.org/wiki/Quorum)  requirements, enabling more agile governance where uncontroversial proposals can pass with relatively low turnout while still maintaining security against manipulation. DAO Lifecycle Creation Each DAO begins with an empty treasury that serves as the repository for value generated through organizational activities and operations. The treasury functions independently from the collateral backing the bonding curve, representing wealth accumulated through the DAO's productive efforts rather than token trading activities. New capers automatically integrate with the platform's social infrastructure, including token-gated forum access where the minimum token requirement for participation can be determined through community governance. This creates natural barriers to entry that filter participants based on economic commitment while allowing communities to adjust accessibility based on their specific needs and development stage. 
Trading and Market Development As capers mature, their governance tokens become available for trading through both the primary bonding curve market and secondary peer-to-peer transactions. The platform facilitates atomic swaps between different caper tokens, creating immediate cross-ecosystem liquidity that enables users to diversify their DAO participation or migrate between projects without converting through base currencies. Market dynamics develop organically as communities grow and demonstrate value creation, with token prices reflecting both speculative interest and fundamental organizational performance. The guaranteed liquidity provided by bonding curves ensures that even smaller or newer capers maintain tradable markets, removing the bootstrapping challenges that typically affect early-stage projects seeking to establish market presence. Fundraising and Growth Project founders participating in price vesting programs experience gradual access to their reserved token allocations as their capers achieve predetermined value milestones. This staged release process aligns founder incentives with long-term project success while providing mechanisms for sustainable funding acquisition throughout organizational development phases. Existing blockchain projects can migrate to the Caper ecosystem through vampire curve mechanisms that effectively bridge tokenomics from legacy systems. This process enables established communities to benefit from Caper's governance and liquidity infrastructure while maintaining continuity with their existing member base and operational history. The Caper Venture Fund's automatic stake acquisition in new projects creates ongoing relationships between the platform and launched DAOs. Projects receiving CVF support gain access to promotional opportunities, technical assistance, and potential partnership networks that can accelerate growth and improve operational effectiveness. 
Governance DAOs experience governance maturation as members accumulate voting history and develop deeper engagement with community decision-making processes. Early-stage capers typically feature broader member participation as individuals work to build vote weight, while mature organizations often develop specialized governance classes of highly engaged members who take primary responsibility for strategic decisions. The transition from legislative signaling to executive proposal implementation marks important developmental milestones, indicating community readiness to undertake binding financial and operational commitments. The mandatory execution delays following executive votes provide ongoing protection for member interests while enabling decisive collective action. Treasury Development and Value Accumulation Successful capers accumulate treasury value through various mechanisms including revenue generation from operational activities, strategic investments, and collaborative partnerships with other DAOs in the ecosystem. Treasury growth directly benefits all members through increased exit value potential, creating alignment between individual and collective interests. The distinction between collateral backing and treasury holdings becomes increasingly significant as organizations mature, with treasury assets representing the productive capacity and accumulated wealth of the community rather than simply the trading value of governance tokens. Exit and Dissolution Member departure from capers occurs through the combination of token liquidation and treasury claim realization based on accumulated vote weight. Long-term participants who have contributed substantially to governance processes can realize significantly greater returns than passive token holders, reflecting their contribution to organizational value creation. The platform's exit mechanisms serve as continuous accountability measures for DAO management and strategic direction. 
Communities that fail to generate value or maintain member engagement face natural dissolution pressure as participants withdraw both their capital and governance participation in favor of more productive alternatives within the ecosystem. ## Blue Chick NFTs URL: https://radix.wiki/ecosystem/blue-chick-nfts Updated: 2026-02-06 Summary: Blue Chicks NFTs is an exclusive collection of 9,999 immutable & procedurally generated chicks with proof of ownership stored on the Radix network. Blue Chicks NFTs is an exclusive collection of 9,999 immutable & procedurally generated chicks with proof of ownership stored on the Radix network. Each Blue Chick is unique & genetically constructed the moment you mint it. Overview The purpose of the Blue Chick collection goes beyond just creating NFTs. It aims to foster a community-driven project that promotes Radix adoption and enhances the utility of NFTs as a whole. We believe that NFTs & Gaming are the perfect play to introduce the Radix DeFi ecosystem to the wider audience. By fostering a community-driven approach and establishing a robust DAO, the collective efforts of our community will breathe life into the Blue Chicks narrative and cultivate an ever-expanding ecosystem. So, what can Blue Chicks do? Aside from being very cool profile pictures, the utilities that these chicks come with range from things like assembling your platoon within the PVP P2E Artillery Tactical Game, giving you access to the DAO and its Treasury management, community events, claiming limited NFTs, airdrops, and more! Fight in a PVP P2E Artillery Tactical Game One of the first waypoints in the Blue Chicks universe is the Artillery Tactical Game, where your ultimate objective is to lead a platoon of x5 fierce chicks and emerge as the sole surviving force on the battlefield. Each Blue Chick’s distinctive attributes will determine the NFT weaponry they’ll receive via airdrops. ✨ Make your Chicks work for you. 
Big things are in store for the Blue Chick holders, one of which is the ability to let your chick be recruited by other players, enabling you to earn a portion of the spoils from their victorious battles. Juicy, right? 👀 Reserve your seat at The Mighty Chicks DAO Each Blue Chick derived from the Genesis 9,999 mint has the power to partake in governance, asset control, and tournament organization on the path to reaching total decentralization and community ownership. The DAO will receive funding through several channels, such as: - 20% from the Genesis 9,999 Blue Chick mint - Up to 90%* of the proceeds from additional NFT mints - Royalties from the secondary market of every NFT sale - Investing, trading, or executing DeFi strategies *Over time, the percentage allocation will gradually increase until it reaches 100%. READ MORE ABOUT THE DAO (https://medium.com/@blue.chick.nfts/the-mighty-chick-dao-empowering-nft-holders-to-shape-the-future-of-the-blue-chick-project-5e2880ae0ac3) What’s next? Lots of alpha…We have big plans for the Blue Chicks NFTs besides delivering awesome gaming experiences, growing the DAO’s treasury, and sharing the gains. Above all, we are focusing on building a community and binding it together to create a feeling of true belonging and ownership that will help in deciding the development, growth, and direction of the Blue Chicks story and ecosystem. If you want to be part of this journey, be sure to join our Discord, share your thoughts and ideas — and we’ll catch you Chicks on the RADIX network! To meet new friends… - Come say Hello in the  Discord (https://discord.com/invite/SHVcwHKUNG) - Follow @BlueChickNFTs on  Twitter (https://twitter.com/blueChickNFTs) - Check our  Website (https://bluechicknfts.wtf/)  & Join the waitlist! ## Blockshard URL: https://radix.wiki/ecosystem/blockshard Updated: 2026-02-06 Summary: Blockshard is a premier Web3 infrastructure provider based in Switzerland. 
Blockshard is a premier Web3 infrastructure provider based in Switzerland. They are committed to delivering enterprise-grade security and reliability for all your staking needs. Blockshard offers a reliable and secure staking service, providing a proven track record, high uptime, advanced security measures, competitive staking rewards, customer support, automated and compounding rewards, and governance participation. Overview Blockshard is a Switzerland-based company that provides reliable and secure staking services for proof-of-stake (PoS) blockchains. Previously known as LetzBake, it is a founding member of UnitedBloc, a global alliance of professional validator organizations that offer custom blockchain solutions, RPC, and white label nodes across 75+ blockchains. Blockshard allows its users to earn rewards by staking their cryptocurrencies and has partnered with top industry protocols like Tezos, Cosmos, Polkadot, and Kusama. The company provides a non-custodial staking service, which means users retain full control over their assets, while Blockshard ensures their safety. As of September 2021, Blockshard has been rated AA and achieved a staked value of over $1.3 million in 90 days. Mission The mission of Blockshard is to provide accessible and user-friendly staking services for holders of various cryptocurrencies. Their aim is to simplify the process of participating in staking and make it more inclusive for a broader range of users. By offering a secure and reliable infrastructure for validating transactions on different blockchain networks, Blockshard contributes to the growth and decentralization of these networks. Blockshard also focuses on promoting transparency and trust within the blockchain ecosystem. 
By providing information about validators' infrastructure and performance, they enable users to make informed decisions while selecting validators to delegate their tokens to. This transparency helps foster a sense of trust between users and the staking service. Overall, the mission of Blockshard is to make staking accessible, secure, and user-friendly, thereby encouraging wider participation and accelerating the adoption of blockchain technology. Services Blockshard offers a range of services focused on providing reliable and secure staking solutions for proof-of-stake (PoS) blockchains. Here are some key services provided by Blockshard: Staking Services Blockshard allows users to stake their cryptocurrencies on various PoS blockchains, enabling them to earn staking rewards. By participating in staking through Blockshard, users can contribute to the security and governance of the supported blockchain networks. Validator Infrastructure Blockshard provides professional validator infrastructure for multiple blockchains. They ensure high availability, security, and scalability of the staking infrastructure, allowing users to delegate their tokens to Blockshard's validators and participate in staking without the technical complexities. Non-Custodial Solutions Blockshard offers non-custodial staking services, giving users full control over their assets. With their non-custodial setup, users retain ownership of their tokens while Blockshard securely manages the technical aspects of the staking process. Custom Blockchain Solutions As a member of UnitedBloc, Blockshard collaborates with other professional validator organizations to offer custom blockchain solutions. These solutions include RPC (Remote Procedure Call) services and white label nodes for various blockchains, assisting businesses and projects in deploying and managing their own blockchain infrastructure. 
It's worth noting that Blockshard partners with and supports a wide range of blockchains, including Tezos, Cosmos, Polkadot, Kusama, and many others. By providing their services across multiple blockchains, Blockshard offers users access to a diverse set of staking opportunities. Supported Networks Blockshard supports a wide range of blockchain networks for staking and provides validator infrastructure for these networks. Here are some of the supported networks: - Tezos (XTZ) - Cosmos (ATOM) - Polkadot (DOT) - Kusama (KSM) - Solana (SOL) - Terra (LUNA) - Avalanche (AVAX) - Harmony (ONE) - Mina Protocol (MINA) - Elrond (EGLD) - NEAR Protocol (NEAR) - Celo (CELO) - Algorand (ALGO) - Flow Network (FLOW) - and more... These are just a few examples, and Blockshard supports staking services for additional networks as well. It's recommended to visit the Blockshard website or contact their team directly for an updated and comprehensive list of supported networks. Staking process The staking process with Blockshard typically involves the following steps: Choose a Supported Network First, you would need to determine which blockchain network supported by Blockshard you want to stake your tokens on. Blockshard supports a variety of networks, as listed above. Create an Account Register an account with Blockshard by providing the required information. This may include your email address, username, and password. Deposit Cryptocurrencies Transfer the desired amount of the supported cryptocurrency to your Blockshard wallet. The specific deposit process may vary depending on the network you choose and the instructions provided by Blockshard. Delegate or Self-Stake Once your tokens are deposited, you can choose to either delegate them to Blockshard's validators or self-stake them. Delegating your tokens involves assigning them to Blockshard's validators, while self-staking allows you to manage the staking process yourself. 
Monitor Staking Rewards As your tokens are staked, you can monitor your staking rewards through the Blockshard platform. Staking rewards are typically distributed based on the network's staking protocol, and Blockshard facilitates the distribution to their users. It's important to note that the specific staking process may vary depending on the supported network and any unique requirements or features of each blockchain. Blockshard aims to simplify the staking process for users, providing a secure and user-friendly interface to participate in staking. Security and Trust Blockshard emphasizes security and trust in their staking services. Here are some points regarding the security measures and trustworthiness of Blockshard: Non-Custodial Approach Blockshard operates on a non-custodial model, which means that users retain control and ownership of their staked assets. This reduces the risk associated with traditional custodial solutions where users have to transfer their assets to a third party. Validator Infrastructure Blockshard maintains a professional validator infrastructure for the supported blockchain networks. This infrastructure is designed to ensure high availability, security, and scalability, thereby mitigating potential risks and vulnerabilities. Industry Experience The team behind Blockshard consists of experienced professionals in the blockchain and cybersecurity domains. Their expertise allows them to implement stringent security measures and best practices to safeguard user assets and data. Audited Validators Blockshard strives to work with audited validators who have undergone external security audits. These audits help assess the security and reliability of the validators' infrastructure and ensure that it meets industry standards. Transparent Information Blockshard provides transparent and open access to information about their validators, including details about their infrastructure, performance, and uptime. 
This transparency enables users to make informed decisions when selecting validators to delegate their tokens to. Active Community Engagement Blockshard actively engages with the community and encourages an open dialogue. They provide regular updates, share insights on the staking process, and address any concerns or queries raised by their users, thereby fostering trust and transparency. It's worth mentioning that while Blockshard implements various security measures, no system is completely immune to risks. Users should also take precautions on their end, such as using secure passwords, enabling two-factor authentication, and regularly updating their security settings. ## Beaker URL: https://radix.wiki/ecosystem/beaker Updated: 2026-02-06 Summary: Beaker was an Automated Market Maker (AMM) built on Radix, with the goal of radically changing the experience of decentralized exchanges. Beaker was an Automated Market Maker (AMM) built on Radix, with the goal of radically changing the experience of decentralized exchanges. Vision The purpose of Beaker is to solve the shortcomings of current DEXes through a complete overhaul of the DEX experience. Beaker identifies the current experience of providing liquidity as closer to gambling than investing. On most DEXes, after providing liquidity to a pool, users are left clueless about the performance of their investment. Beaker aims to change this by using distributed ledgers to their full capacity to automate this process. The design of Beaker revolves around three ideas: - Radically better technology: Beaker is working on making a proper abstraction of the AMM problem to find better market-making functions. Once a proper model for the problem is found, they will work on the right solution. This work should also allow them to find a better fee model that would take into account market conditions. - More serious and responsible DeFi: Beaker believes in #RealYield, meaning that liquidity providers will only receive the commissions due to them. 
Beaker will provide tools as turnkey solutions to simulate and track investments and risks. - Simplicity and elegance: Beaker aims to attract traditional investors but also wants to keep it simple for those who don’t want to scratch their heads too much. Beaker will offer both simple and advanced tools depending on the user’s preference. Features At the moment, Beaker is running a minimal version on the Radix Betanet 2. Its features include swapping and providing liquidity. Beaker currently uses a Uniswap v2 AMM model where fees are not automatically compounded. After providing liquidity, users get a Position, a Non-Fungible Resource (NFR) that gives an overview of the liquidity provided. Liquidity Pools Beaker uses the constant-product market-making model for its liquidity pools. Providing liquidity comes with a risk called impermanent loss. Beaker does not plan on minting tokens to give additional rewards to liquidity providers (also known as liquidity mining). Relative Portfolio Evolution Beaker introduces the concept of relative portfolio evolution to give a better understanding of liquidity providing by taking fees into account. This concept frames a liquidity position as a bet on the pool's relative exchange-rate evolution at the moment liquidity is removed. For more detailed information, please visit the official Beaker documentation (https://docs.beaker.fi/) . Tokenomics At the moment, Beaker does not plan to issue a token. They are very sceptical about the real utility of "DAO tokens" and believe that they are mainly speculative assets that some projects use to make a profit. Therefore, issuing such tokens would be against their philosophy. Team The team behind Beaker consists of three master's degree students from Ecole Normale Superieure de Lyon (https://www.ens-lyon.fr) . - Theodore Conrad-Frenkiel, frontend development. - Arthur Vinciguerra, Scrypto code. - Guillaume Méroué, backend development. 
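The constant-product model and impermanent-loss risk described under Liquidity Pools above can be made concrete with a short sketch. This is an illustration of the generic Uniswap v2 mechanics, not Beaker's code; the 0.3% fee is Uniswap v2's default, used here only as an example, since Beaker's own fee model is described above as still being researched.

```python
# Sketch of a constant-product (x * y = k) pool with a Uniswap-v2-style
# fee, plus the standard impermanent-loss formula. Numbers are illustrative.

def swap_out(reserve_in: float, reserve_out: float,
             amount_in: float, fee: float = 0.003) -> float:
    """Output amount for a constant-product swap; the fee stays in the pool."""
    effective_in = amount_in * (1 - fee)
    return reserve_out * effective_in / (reserve_in + effective_in)

def impermanent_loss(price_ratio: float) -> float:
    """LP position value relative to simply holding, for a given relative
    price change `price_ratio` (0.0 means no loss; negative means loss)."""
    return 2 * price_ratio ** 0.5 / (1 + price_ratio) - 1
```

For example, a 4x relative price move costs a fee-less LP about 20% versus holding, which is the risk Beaker's tracking tools are meant to make visible before accrued fees are taken into account.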
Business Model Beaker acts as a platform that matches liquidity providers and traders. This kind of platform usually takes a 15% commission fee. Beaker will therefore take up to a 15% commission on all accrued fees. They plan on making their tools free for all users. However, tools that track investments and risks will only work for their pools. If it is not sustainable, they reserve the right to charge for these tools or make them available only for liquidity providers. Further Reading https://arxiv.org/pdf/2301.08558.pdf (https://arxiv.org/pdf/2301.08558.pdf) ## BCW Technologies URL: https://radix.wiki/ecosystem/bcw-technologies Updated: 2026-02-06 Summary: BCW Technologies is an enterprise technology firm that is involved in the development of Web3 infrastructure and applications. BCW Technologies is an enterprise technology firm that is involved in the development of Web3 infrastructure and applications. BCW Technologies is particularly known for their expertise in fintech and their dedication to transforming the distributed ledger technology (DLT) ecosystem. BCW Technologies is part of BCW, a global communications agency that partners with clients in various sectors including B2B, consumer, corporate, crisis management, healthcare, public affairs, purpose, and technology. Overview BCW Technologies is a company that focuses on technology development and its impact on society. They are committed to supporting the industry at the forefront of global evolution, aiming to connect humanity, drive progress, and change the way people live, work, and connect with each other. BCW Technologies specializes in web3 technologies at scale and offers Infrastructure-as-a-Service (IaaS) solutions, such as Arkhia. Arkhia provides web3 backend services for blockchain and distributed ledger technology (DLT) developers, including API suites, monitoring, analytics, and enhanced developer tools. 
BCW Technologies has expertise in the integration of existing products into the web3 space, serving enterprise clients who want to expand their presence in decentralized technology. They provide consulting and venture studio services to help businesses in their transformation and adoption of web3 technologies. Source: BCW Global. About (https://www.bcw-global.com/about) Mission BCW Technologies' mission focuses on leveraging technology to support industries and drive progress. The company aims to provide infrastructure, applications, and solutions that connect and interact with the digital universe. BCW Technologies is committed to transforming the distributed ledger technology (DLT) ecosystem and creating a frictionless customer experience for digital asset-based services. Their expertise lies in developing industry-leading web3 technologies at scale and offering infrastructure-as-a-service (IaaS) solutions, such as Arkhia. Arkhia provides web3 backend services for blockchain and DLT developers, including API suites, monitoring, analytics, and enhanced developer tools. While specific details of BCW Technologies' mission may vary or evolve over time, their dedication to driving innovation and transformation in the web3 and blockchain space remains central to their operations. Source: BCW Group. Mission Statement (https://www.linkedin.com/company/bcw-group2019/) Products BCW Technologies offers a range of products that aim to address the challenges businesses face when integrating blockchain technology into their operations. Here's a brief overview of those products: Hashport Hashport is a blockchain interoperability solution provided by BCW Technologies. It aims to bridge different blockchain networks, enabling seamless communication and value transfer between them. With Hashport, businesses can benefit from cross-chain transactions, improved liquidity, and enhanced data sharing capabilities. 
Blockpour Blockpour is a data analytics platform that enables businesses to gain deeper insights into their blockchain data. The platform leverages machine learning algorithms to analyze blockchain data and provide real-time insights, allowing businesses to optimize their operations and decision-making. TOKO TOKO is a tokenization-as-a-service platform provided by BCW Technologies. It aims to simplify the process of issuing, managing, and trading digital assets. With TOKO, businesses can benefit from increased liquidity, reduced costs, and improved accessibility, making it easier to invest in new assets or manage existing ones. Staking Services Staking is the process of holding and securing cryptocurrencies to support a blockchain network's operations. BCW Technologies provides staking services for several blockchain networks, allowing users to earn staking rewards while supporting the security and stability of the network. Arkhia Arkhia is BCW Technologies' Infrastructure-as-a-Service (IaaS) solution. It provides a range of backend services and developer tools to help businesses build, deploy, and manage blockchain applications without worrying about the underlying infrastructure. Arkhia includes API suites, monitoring capabilities, analytics, and enhanced development tools designed to deliver a secure and reliable web3 backend infrastructure. Impacts BCW Technologies can have various impacts in the web3 and blockchain space. Here are some potential impacts that their products and services can have: Advancement of Web3 Technology BCW Technologies contributes to the advancement of web3 technologies by providing infrastructure, backend solutions, and developer tools. This helps developers and businesses leverage blockchain and decentralized technologies to build innovative and transparent applications. Increased Efficiency and Security BCW Technologies' solutions can enhance the efficiency and security of blockchain-based systems. 
By providing scalable infrastructure and secure backend solutions, BCW enables businesses to operate with greater reliability, scalability, and data integrity. Fostering Innovation By offering specialized services and consulting in the web3 and blockchain field, BCW Technologies can help businesses and organizations unleash their creative potential. They can guide and support the development of pioneering applications and use cases that leverage blockchain and distributed ledger technology. Transformation of Industries BCW Technologies specializes in fintech solutions, which can drive the transformation of the financial industry. By leveraging blockchain technology, they can enable secure and efficient financial transactions, improve transparency in financial systems, and empower individuals with more control over their financial assets. Empowering Developers BCW Technologies provides developer tools and API suites that empower developers to build decentralized applications. By offering these tools, developers can accelerate the creation of web3 solutions and contribute to the growth and development of the blockchain ecosystem. Security and Trust BCW Technologies prioritizes security and trust in its operations and solutions. Here are a few aspects that highlight their commitment to ensuring a secure and trustworthy environment: Blockchain Technology BCW Technologies leverages blockchain and distributed ledger technology (DLT) to enhance security and transparency in their solutions. These technologies are designed to provide immutable and decentralized data storage, reducing the risk of data manipulation or unauthorized access. Infrastructure-as-a-Service (IaaS) BCW Technologies offers IaaS solutions, such as Arkhia, which provides secure and reliable web3 backend services. These services include robust API suites, monitoring capabilities, analytics, and developer tools designed to deliver a secure infrastructure for blockchain and DLT applications. 
Security Protocols and Audits BCW Technologies implements rigorous security protocols and conducts regular audits to identify vulnerabilities and ensure their systems are resilient against cyber threats. By following industry best practices and complying with security standards, they aim to protect the integrity of their technology and user data. Consulting and Expertise BCW Technologies offers consulting services to assist organizations in understanding and implementing security measures in their digital transformation journeys. Their team of experts provides guidance on best practices for securing decentralized systems, helping clients build trust and confidence in their technology infrastructure. ## Backeum URL: https://radix.wiki/ecosystem/backeum Updated: 2026-02-06 Summary: Backeum was an online platform designed to foster a culture of appreciation and backing within the Radix community. Backeum was an online platform designed to foster a culture of appreciation and backing within the Radix community. It served as a bridge between creators and their backers, allowing creators to fundraise, share exclusive content, and reward their supporters with unique NFTs, all while building a genuine community around their work. https://youtu.be/AB1LkI3SDSA?si=zDwHMVb33GkXVUe9 (https://youtu.be/AB1LkI3SDSA?si=zDwHMVb33GkXVUe9) Overview Backeum promotes a simple culture of appreciation and backing. By extending support to those pushing forward, the platform aims to help the network grow. Users can discover someone they admire (https://backeum.com/) , tip them, and send a heartfelt 'good job' their way. In a blog post (https://backeum.com/blog/meet-backeum) introducing Backeum, the team shared their inspiration drawn from the Radix community's radvocates who are passionate about decentralized finance. The platform was envisioned as a friendly neighborhood cafe, focusing on genuine interactions and appreciation. 
NFTs on Backeum are likened to "thank you" cards, given by creators to express gratitude to their supporters. These NFTs can be found in the Radix Wallet, ensuring they are easily accessible, secure, and truly owned by the user. Features: - Profile: Users can create a Backeum profile (https://backeum.com/) to share information about themselves and their work. This profile allows them to post updates, receive tips and donations, and verify their identity with their XRD domain for added trust. - Community: The platform boasts a community filled with individuals who genuinely care and work hard. Users can engage with their backers, connect with others, and experience the value of mutual support. - NFT Collections: Backeum offers the unique feature of turning gratitude into NFTs (https://backeum.com/faq#NFT%20Collections) . Creators can design NFT collections to reward their backers, giving them a special token of appreciation. - Competitions: Backeum hosts competitions centered around talent and community. These friendly contests allow users to showcase their skills and let the community be both their cheerleader and jury. Frequently Asked Questions - What is Backeum? Backeum is a platform designed for creators and their backers. It enables creators to fundraise, share exclusive content, and reward their supporters with unique NFTs, all while building a genuine community around their work. - When will the Backeum dApp launch? The dApp is nearing completion and will be launching soon. Updates regarding the exact launch date will be provided on official channels (https://backeum.com/blog/meet-backeum) . - How do backers support creators on Backeum? Backers can support creators in various ways, such as through donations, purchasing or trading NFTs, participating in competitions, and accessing exclusive content. - How do I connect my Radix wallet to Backeum? Users can select the "Connect" button in the top right menu and follow the on-screen prompts. 
For setting up the wallet connection to the browser, users can visit Radix's official website (https://wallet.radixdlt.com/) . For more detailed information, you can visit the FAQ section (https://backeum.com/faq) on the website. ## Avaunt Staking URL: https://radix.wiki/ecosystem/avaunt-staking Updated: 2026-02-06 Summary: Avaunt Staking Services is a company that specializes in cryptocurrency staking services, facilitating the operation of validator nodes on a variety of blockchain projects. Avaunt Staking Services is a company that specializes in cryptocurrency staking services, facilitating the operation of validator nodes on a variety of blockchain projects. As a validator operator, Avaunt Staking Services enables token holders, also known as delegators, to delegate their tokens to the company's nodes. In return, these delegators receive a portion of the network rewards. As of 2023, Avaunt Staking Services manages validator nodes on several networks including Avalanche, Radix, and the SSV Network. https://youtu.be/BJLWzqqWzFM (https://youtu.be/BJLWzqqWzFM) Overview Avaunt Staking Services was established to cater to the shift from Proof-of-Work (PoW) to Proof-of-Stake (PoS) consensus mechanisms in the blockchain industry. PoS, unlike PoW, relies on a network of validator nodes and staked collateral to secure the network, rather than computational energy. Avaunt Staking Services provides a platform where individuals and entities can delegate their staking tokens to a validator, earning a percentage of the validator's revenue in return. In addition to their primary staking service, Avaunt also offers IT and AV consultancy, with over 20 years of experience in IT infrastructure project delivery, and web design & development services, encompassing branding, design & build, hosting & domain registration, social media, SEO, and marketing. History As of 2023 (https://avaunt-staking.com) , Avaunt Staking Services is accepting delegations on the Avalanche Network and Radix. 
They have over $5 million staked across more than 400 delegators and are active on three networks, with plans for expansion. Mission Avaunt Staking is community-focused, with a mission to help each and every project grow and provide transparency to the delegator community (https://avaunt-staking.com) . Products & Services Avaunt Staking The company provides secure infrastructure, high availability cloud hosted infrastructure, secured in multiple data centres across the globe and monitored 24/7. They are also a trusted validator on some of the biggest Proof of Stake networks. Delegators play a critical role in the system, as they show support for reputable validators. Running a Validator Node requires a high level of technical knowledge and commitment to ensure that a consistent quality of service is maintained. Radix Dashboard https://www.radixdashboard.com (https://www.radixdashboard.com/) Consultancy In addition to staking services, Avaunt offers IT and AV consultancy with 20+ years of experience, specializing in IT Infrastructure project delivery. They also provide web design and development services including branding, design & build, hosting & domain registration, social media, SEO & marketing. Supported Networks As of 2023, Avaunt Staking Services supports the following networks: Avalanche Avalanche is an open-source platform that facilitates the launch of highly decentralized applications, new financial primitives, and interoperable blockchains. Radix Radix is a decentralized network designed to provide developers with a platform where they can build rapidly without the threat of exploits and hacks, and without scaling being a bottleneck. SSV Network SSV.network is a decentralized staking infrastructure that enables the distributed operation of an Ethereum validator, providing simple and scalable access to decentralized ETH staking. 
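Delegated proof-of-stake services like Avaunt's typically split each epoch's rewards pro-rata by stake after a validator fee, and a slashing event deducts a proportion of every delegated stake. A minimal sketch of that accounting, in which the 2% fee and the slash rate are invented example figures rather than Avaunt's actual terms:

```python
# Sketch of per-epoch delegator accounting for a PoS validator: rewards
# are shared pro-rata by stake after a validator fee, and slashing removes
# a fraction of each delegated stake. Rates here are illustrative only.

def epoch_rewards(stakes: dict[str, float], epoch_emission: float,
                  validator_fee: float = 0.02) -> dict[str, float]:
    """Split one epoch's emission pro-rata by stake, after the validator fee."""
    total = sum(stakes.values())
    net = epoch_emission * (1 - validator_fee)
    return {name: net * stake / total for name, stake in stakes.items()}

def apply_slash(stakes: dict[str, float], slash_rate: float) -> dict[str, float]:
    """Deduct a slashing penalty proportionally from every delegated stake."""
    return {name: stake * (1 - slash_rate) for name, stake in stakes.items()}
```

This captures why delegators share both revenue and risk with their validator: the same stake figure scales their reward each epoch and their loss if the validator is slashed.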
Staking Process For token holders interested in becoming delegators with Avaunt Staking Services, the process generally follows these steps: - Token holders delegate their staking tokens to a validator node managed by Avaunt Staking Services. - Once the tokens are delegated, they are added to the validator's staking pool. - Rewards, which are paid in the native network token, are distributed to the delegator's wallet at the end of each epoch (a fixed period of time unique to each network). Delegators share both the revenue and risks with their validators. If a validator behaves maliciously or underperforms, delegators may face a penalty known as slashing, in which a proportion of their delegated stake is deducted. Security and Trust To maintain a high level of service, Avaunt Staking Services operates high availability cloud-hosted infrastructure, secured in multiple data centers across the globe and monitored 24/7. The company prides itself on its commitment to the delegator community, providing transparency and actively contributing to the growth of each project. ## Arcane Labyrinth URL: https://radix.wiki/ecosystem/arcane-labyrinth Updated: 2026-02-06 Summary: Arcane Labyrinth is an adventure game that offers players the experience of navigating through a labyrinth consisting of 101 different floors. Arcane Labyrinth is an adventure game that offers players the experience of navigating through a labyrinth consisting of 101 different floors. The game is set in a fantasy world and features an attractive pixel art style, designed to evoke a sense of nostalgia in players. Unique to Arcane Labyrinth is its play-to-earn system, which rewards players with $ARC tokens upon successful completion of each floor of the maze. About Arcane Labyrinth: Uncovering the Mysteries of a Fantasy World is an adventure game that immerses players in a magical fantasy world filled with various obstacles and challenges. 
The game's graphics utilize a 2D pixel art style that combines fantasy and magical elements, providing players with an interesting and fun nostalgic experience. The game's difficulty is designed to be exciting yet fair, and not overly challenging to overcome. Plot The game's story revolves around a character named Jack, a seasoned adventurer from a remote village. Jack stumbles upon a mysterious portal in the middle of a forest, which transports him to a world filled with monsters and mazes. Gameplay The objective of the game is to complete each floor of the labyrinth by battling monsters and collecting treasures scattered throughout the maze. Players can discover new weapons and magic to assist them in combat and in solving more complex maze floors. The game also includes a co-op multiplayer feature, allowing players to collaborate and assist each other in completing the floors of the maze. Tokenomics The game's economy is based on $ARC tokens, with a total supply of 100,000,000 tokens. These tokens are used for governance and play-to-earn rewards. The token distribution includes 12% for airdrops and liquidity providers. Vision and Mission The creators of Arcane Labyrinth aim to provide a fun and challenging gaming experience, with innovative features and unique level design that harkens back to classic gaming concepts. Their mission includes creating an eye-catching and stunning game design, making gameplay challenging yet fair and fun, providing a co-op multiplayer feature, crafting an engaging and innovative story, and continuously developing and improving the game. ## Anthic URL: https://radix.wiki/ecosystem/anthic Updated: 2026-02-06 Summary: Anthic is an intent-based trading platform launching on the Radix network in Q1 2025. The platform aims to bridge the gap between centralized exchange (CEX) liq Anthic is an intent-based trading platform (https://anthic.medium.com/introducing-anthic-640b39e40688) launching on the Radix network in Q1 2025. 
The platform aims to bridge the gap between centralized exchange (CEX) liquidity and decentralized exchange (DEX) security by implementing a hybrid architecture that combines off-ledger expression of liquidity with on-chain execution. https://youtu.be/6YEASF6diOQ (https://youtu.be/6YEASF6diOQ) Overview The platform's core innovation is its Flash Liquidity system (https://anthic.medium.com/flash-liquidity-12a566e30498) , which allows market makers to provide liquidity without pre-funding on-chain pools. This approach enables real-time, low-slippage trades by combining centralized exchange efficiency with decentralized finance (DeFi) principles. Flash Liquidity aggregates liquidity from multiple sources, including both on-ledger and off-ledger assets, while maintaining the non-custodial and permissionless nature of DeFi. At launch, Anthic will integrate with leading market makers (https://anthic.medium.com/anthic-launch-parters-33cb59c28a10) such as Keyrock, G-20, and Portofino, alongside Radix DEXs including Ociswap (Ociswap%20db98af5ef73e4c41b0014f6dc05fb132.md) , CaviarNine (CaviarNine%20aeb1379af8a145209d11aba444339559.md) , and Astrolescent (Astrolescent%206973417ff80440f2a816a1ed2465961e.md) . The platform initially targets support for at least 25 of the top crypto assets, with plans to expand coverage to the top 100 and beyond. A key feature of Anthic is its gasless trading system (https://anthic.medium.com/anthic-deep-dive-ca670e8d5235) , which enables trades to complete within 5 seconds or less. The platform utilizes Radix's native mechanisms including Subintents, an off-ledger Order Messaging System, and an on-ledger ecosystem of Solvers to facilitate efficient trade execution. Rather than introducing a new token, Anthic's economic model (https://anthic.medium.com/anthic-fees-db65f8049f52) is built around $XRD, the native token of the Radix network. 
The platform implements a maker-taker fee model with 100% of fees being converted to $XRD through market purchases. These fees are then distributed with 40% locked from circulation, 30% shared with integrated DEXs, and 30% allocated to development and operations. The platform entered public testnet in December 2024 (https://anthic.medium.com/anthic-live-on-testnet-8e5f416fd1ae) , featuring full end-to-end functionality including live price feeds from market maker partners, test trading capability using flash liquidity, and test solver bots matching orders as they would in production. Technology Anthic's architecture combines off-ledger liquidity expression with on-chain execution through several key technological innovations: Flash Liquidity Flash Liquidity (https://anthic.medium.com/flash-liquidity-12a566e30498) enables market makers to provide liquidity without pre-funding on-chain pools. The system uses Instamint, powered by Instabridge (Instabridge%20185dfc2a7e3d805e8701e389ce15fe61.md) , to allow market makers to mint wrapped assets on Radix as needed, using short-term (<24h) lines of credit. This approach significantly improves capital efficiency as market makers only move assets on-chain when trades are confirmed. Intent-Based Trading Anthic's intent-based system (https://docs.anthic.io/concepts/subintents) differs from traditional Automated Market Makers (AMMs) by moving order matching and coordination off-ledger while maintaining on-chain settlement. Unlike traditional DEX systems (https://youtu.be/6YEASF6diOQ) that require separate transactions for makers and takers, Anthic's intent system allows both sides of a trade to be bundled into a single transaction, significantly reducing complexity and costs. Order Messaging System The Order Messaging System (https://docs.anthic.io/integration/dex/flow_of_operations) serves as the core off-ledger infrastructure, facilitating interaction between liquidity providers, DEXs, and their users. 
It maintains a Subintent Pool similar to a blockchain mempool, where trade intentions are stored and matched before on-chain execution. The system enables (https://youtu.be/6YEASF6diOQ) market makers to express liquidity they hold on other venues without needing to move inventory until execution. Subintent Architecture Subintents (https://docs.anthic.io/concepts/subintents) function as signed definitions of trade parameters, authorizing transactions within predefined conditions such as price, slippage, and time limits. This approach differs substantially (https://youtu.be/6YEASF6diOQ) from other intent-based platforms like COW Swap, as Anthic's subintents integrate directly with Radix's system-level architecture rather than existing solely at the application layer. Settlement Process The settlement process (https://anthic.medium.com/anthic-deep-dive-ca670e8d5235) completes within 5 seconds, significantly faster than COW Swap's 5-minute auction process (https://youtu.be/6YEASF6diOQ) . The rapid settlement involves: - Users submit signed subintents through DEX interfaces. - Market makers provide real-time pricing streams. - The Order Messaging System matches compatible subintents. - Solvers bundle matched subintents into transactions. - Transactions are executed on the Radix ledger. Front-Running Prevention Anthic prevents front-running through its price stream mechanism (https://anthic.medium.com/anthic-deep-dive-ca670e8d5235) . Unlike COW Swap's auction-based approach (https://youtu.be/6YEASF6diOQ) , Anthic requires market makers to provide pricing before knowing specific trade details. Once a user commits to a price, the market maker cannot modify it, effectively eliminating the possibility of sandwich attacks without requiring lengthy auction periods. Features - Real-Time Trading: Anthic provides near-instantaneous trade execution (https://anthic.medium.com/anthic-live-on-testnet-8e5f416fd1ae) , with transactions typically completing within 5 seconds. 
This represents a significant improvement (https://youtu.be/6YEASF6diOQ) over other intent-based platforms like COW Swap, which requires 5-minute auction periods for trade execution. - Cross-Chain Asset Support: The platform enables trading of any crypto asset (https://anthic.medium.com/introducing-anthic-640b39e40688) issued on Radix. Market makers can express liquidity (https://youtu.be/6YEASF6diOQ) they hold on other venues without moving inventory until execution, allowing them to efficiently provide deep liquidity across multiple chains. - Capital Efficiency: Through Flash Liquidity (https://anthic.medium.com/flash-liquidity-12a566e30498) , market makers can provide liquidity across multiple assets without locking up capital in specific pools. This addresses a key limitation (https://youtu.be/6YEASF6diOQ) of other intent-based platforms like COW Swap, where market makers must pre-fund their positions with on-chain inventory before trading. - User Security: All trades maintain full non-custodial security (https://anthic.medium.com/anthic-deep-dive-ca670e8d5235) through the use of cryptographically signed subintents. Users retain complete control of their assets throughout the trading process, with transactions only executing when all predefined conditions are met. - Gasless Transactions: Users do not need to hold $XRD (https://anthic.medium.com/anthic-deep-dive-ca670e8d5235) for transaction fees, as trades can be executed using the traded asset itself. The bundling of maker and taker actions (https://youtu.be/6YEASF6diOQ) into single transactions further reduces gas costs compared to traditional DEX approaches requiring separate transactions. - Price Improvement: The platform implements a sophisticated price aggregation system (https://anthic.medium.com/introducing-anthic-640b39e40688) that combines liquidity from multiple sources. 
Market makers stream real-time pricing (https://youtu.be/6YEASF6diOQ) that accurately reflects current market conditions across all venues where they operate, ensuring users receive optimal pricing while maintaining the security of decentralized trading. - DEX Integration: Anthic integrates with existing Radix DEXs (https://anthic.medium.com/introducing-anthic-640b39e40688) such as Ociswap, CaviarNine, and Astrolescent, allowing them to combine their native AMM liquidity with Anthic's aggregated liquidity. - Front-Running Protection: The platform's intent-based architecture (https://anthic.medium.com/anthic-deep-dive-ca670e8d5235) prevents front-running through a unique price streaming mechanism. Unlike traditional AMMs (https://youtu.be/6YEASF6diOQ) where orders can be seen and front-run, or auction-based systems that require long waiting periods, Anthic requires market makers to commit to prices before seeing specific trades, eliminating the possibility of sandwich attacks while maintaining rapid execution. Market Integration Market Maker Partnerships Anthic has established strategic development partnerships (https://anthic.medium.com/anthic-launch-parters-33cb59c28a10) with three major crypto market makers: Keyrock, G-20, and Portofino. Keyrock, supported by industry players including Ripple and SIX Fintech Ventures, operates on over 85 trading venues and collaborates with major platforms like Binance and Kraken. G-20 maintains activity across 60 crypto exchanges and serves hundreds of clients with various token projects. Portofino Technologies provides liquidity on over 100 tokens and holds registrations with the UK FCA and BVI Financial Services Commission. DEX Integration Framework Anthic's integration model (https://docs.anthic.io/integration/dex/) enables existing Radix DEXs to leverage its liquidity alongside their native pools. The platform provides DEXs with an API framework that includes order book access, fee information, and limit order functionality. 
Initial integrations include Ociswap (Ociswap%20db98af5ef73e4c41b0014f6dc05fb132.md) , CaviarNine (CaviarNine%20aeb1379af8a145209d11aba444339559.md) , and Astrolescent (Astrolescent%206973417ff80440f2a816a1ed2465961e.md) , with trades executed through these platforms benefiting from both their native AMM liquidity and Anthic's aggregated market maker liquidity. Asset Support The platform initially targets support (https://anthic.medium.com/introducing-anthic-640b39e40688) for at least 25 of the top crypto assets by market capitalization, with plans to expand to the top 100. Through Instabridge, Anthic already supports major assets including BTC, ETH, USDC, and USDT on the Radix network. Revenue Sharing Anthic implements a DEX revenue sharing program (https://anthic.medium.com/anthic-fees-db65f8049f52) where 30% of all trading fees are distributed to integrated DEXs based on their proportional contribution to trading volume. This incentivizes DEX participation and helps grow the broader Radix trading ecosystem. Integration Technology The platform provides comprehensive technical documentation (https://docs.anthic.io/) and SDKs for integration partners. This includes detailed APIs for trade execution, Flash Liquidity provision, and solver operations. DEXs can utilize these tools to offer their users access to Anthic's liquidity while maintaining their own user interface and trading experience. Economic Model Fee Structure Anthic operates on a maker-taker fee model (https://anthic.medium.com/anthic-fees-db65f8049f52) with no maker fees at launch to encourage liquidity depth. Taker fees are set at 0.1% of trade value, with progressive discounts available based on trading volume and $XRD staking levels. Fee Distribution The platform converts 100% of collected fees to $XRD (https://anthic.medium.com/anthic-fees-db65f8049f52) through market purchases, which are then distributed as follows: - 40% is locked away and removed from circulating $XRD supply. 
- 30% is distributed to integrated DEXs proportional to their trading volume. - 30% supports Anthic's development, marketing, and operations. User Incentives The platform implements a tiered fee structure (https://anthic.medium.com/anthic-fees-db65f8049f52) with six levels, from Standard to VIP 5. Users can qualify for reduced fees based on: - 30-day trading volume ($10,000 to $10,000,000+). - $XRD staking amount (75,000 to 3,000,000 LSU). - Taker fees decrease from 0.10% at the Standard tier to 0.05% at VIP 5. DEX Revenue Share Integrated DEXs receive (https://anthic.medium.com/anthic-fees-db65f8049f52) a share of trading fees proportional to the volume they contribute. To qualify, DEXs must register with Anthic and receive an authenticated API. This incentivizes DEX participation and helps grow the Radix trading ecosystem. Settlement Fees The platform charges two types of settlement fees (https://docs.anthic.io/concepts/fees) : - A solver fee paid to transaction bundlers. - A transaction execution fee covering network costs. - A portion of the transaction execution fee may be rebated to users if the actual execution cost is lower than the initial charge. Development Status Testnet Launch Anthic launched on the Radix public testnet (https://anthic.medium.com/anthic-live-on-testnet-8e5f416fd1ae) in December 2024, featuring a complete end-to-end implementation. The testnet deployment includes live price feeds from market maker partners, test trading capability using flash liquidity, and test solver bots matching orders in real-time conditions. Developer Resources The platform has released comprehensive technical documentation (https://docs.anthic.io/) covering integration patterns for DEXs, market makers, and solvers. A developer SDK wrapping the Anthic API is available on GitHub, providing tools for building against the platform's core functionality. 
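The fee mechanics described under Economic Model — a 0.1% taker fee at the Standard tier, fully converted to $XRD, then split 40% locked / 30% to integrated DEXs / 30% to operations — can be illustrated with a small sketch. The trade size and $XRD price used here are arbitrary assumptions for the example, not Anthic figures.

```python
# Illustrative sketch of Anthic's published fee split: the taker fee is
# collected, converted entirely to XRD, then divided 40/30/30.
# Trade value and XRD price below are assumptions for the example.

TAKER_FEE = 0.001  # 0.1% at the Standard tier; VIP tiers go as low as 0.05%

def fee_split(trade_value_usd, xrd_price_usd, taker_fee=TAKER_FEE):
    """Return the XRD amounts locked, shared with DEXs, and kept for operations."""
    fee_xrd = trade_value_usd * taker_fee / xrd_price_usd  # 100% converted to XRD
    return {
        "locked": fee_xrd * 0.40,      # removed from circulating supply
        "dex_share": fee_xrd * 0.30,   # pro rata to integrated DEX volume
        "operations": fee_xrd * 0.30,  # development, marketing, operations
    }

split = fee_split(trade_value_usd=10_000.0, xrd_price_usd=0.05)
# a $10,000 trade yields a $10 fee -> 200 XRD: 80 locked, 60 to DEXs, 60 to ops
```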
Integration Progress Market maker integration (https://anthic.medium.com/anthic-deep-dive-ca670e8d5235) testing is underway, with partners including Keyrock, G-20, and Portofino actively connecting to the system. The platform has successfully demonstrated order fills and price streaming through its test deployment. Current Features The testnet implementation includes: - Live indicative price streams (https://anthic.medium.com/anthic-live-on-testnet-8e5f416fd1ae) from high liquidity venues. - Aggregated pricing from multiple market makers. - Functional test APIs for integration testing. - Live solver bundling and transaction submission. - Test deployment of Instamint integration. Mainnet Timeline Anthic targets a Q1 2025 mainnet launch (https://anthic.medium.com/introducing-anthic-640b39e40688) with initial support for at least 25 top crypto assets. The platform plans to expand coverage to the top 100 assets and beyond post-launch, while maintaining its focus on deep liquidity and efficient price execution. Technical Architecture Order Messaging System The Order Messaging System (https://docs.anthic.io/concepts/subintents) functions as Anthic's core infrastructure, coordinating liquidity and order flow between DEXs, liquidity providers, and DeFi applications. It maintains a Subintent Pool that collects and matches signed trade intentions before on-chain settlement. The system offers Web2-style API integration familiar to market makers, closely resembling Request for Stream (RFS) models used in centralized exchanges. Solver Network Solvers play a critical role (https://docs.anthic.io/integration/solver) in the Anthic ecosystem, receiving wrapped versions of limit order subintents from both DEXs and market makers. They bundle coincident subintents into valid transactions and submit them to the Radix network. Solvers receive both a flat fee in $XRD and a transaction execution fee for successful submissions. 
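How a solver might pair crossing subintents before bundling them into a single transaction can be sketched as below. The data model is an assumption for illustration only and is not Anthic's actual API: real subintents carry full trade parameters and cryptographic signatures, and settlement happens on the Radix ledger.

```python
# Hypothetical sketch (not Anthic's API) of subintent matching: each
# subintent fixes a side, a price limit, and an expiry; a solver bundles
# a crossing, still-live pair into one transaction for on-ledger settlement.

import time
from dataclasses import dataclass

@dataclass
class Subintent:
    side: str           # "buy" or "sell"
    price_limit: float  # max price to pay (buy) / min price to accept (sell)
    amount: float
    expires_at: float   # unix timestamp bounding the execution window

def match(buy: Subintent, sell: Subintent, now: float) -> bool:
    """Two subintents cross if both are live and the buyer's limit
    meets or exceeds the seller's limit."""
    live = now < buy.expires_at and now < sell.expires_at
    return (live and buy.side == "buy" and sell.side == "sell"
            and buy.price_limit >= sell.price_limit)

now = time.time()
buy = Subintent("buy", price_limit=1.02, amount=500.0, expires_at=now + 5)
sell = Subintent("sell", price_limit=1.00, amount=500.0, expires_at=now + 5)
assert match(buy, sell, now)  # crossing pair -> bundle into one transaction
```

Because both sides are pre-signed within their limits, the solver can submit the bundle without further user interaction, which is what allows settlement to complete within the 5-second window described above.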
Market Maker Integration The platform employs a sophisticated Flash Liquidity system (https://anthic.medium.com/flash-liquidity-12a566e30498) for market maker integration, featuring: - Real-time price streaming infrastructure. - Multi-session handling capability. - Authenticated API access. - Fill reconciliation mechanisms. - Post-trade settlement processes. DEX Aggregation The DEX integration framework (https://docs.anthic.io/integration/dex/) enables existing Radix DEXs to combine their native AMM liquidity with Anthic's aggregated market maker liquidity. This includes: - Order book access API. - Fee computation systems. - Limit order submission. - Price aggregation mechanisms. Security Architecture The system implements (https://anthic.medium.com/anthic-deep-dive-ca670e8d5235) multiple security layers: - Cryptographic validation of subintents. - Time-bound execution windows. - Slippage protection mechanisms. - Front-running prevention through price commitment. - Non-custodial trading workflows. Infrastructure Components The platform utilizes several key infrastructure components (https://anthic.medium.com/anthic-deep-dive-ca670e8d5235) : - Instamint for on-demand asset minting. - Instabridge for cross-chain asset bridging. - Subintent management system. - Price feed aggregation. - Transaction construction and validation. ## AlphaDEX URL: https://radix.wiki/ecosystem/alphadex Updated: 2026-02-06 Summary: AlphaDEX is a decentralized order book exchange being built on Radix. The platform was founded by Fred Liebenberg and was part of the Radix Grants Program Cohor AlphaDEX is a decentralized order book exchange being built on Radix. The platform was founded by Fred Liebenberg (https://www.notion.so/Fred-Liebenberg-ca5e5863813b41afbfec3fd44b4495af?pvs=21) and was part of the Radix Grants Program Cohort 1 (https://www.radixdlt.com/blog/radix-grants-program-cohort-1-update-1) . The platform is set to launch on Babylon Mainnet in 2023. 
https://youtu.be/NkhFqpefWlA (https://youtu.be/NkhFqpefWlA) History Liebenberg first became interested in building traditional finance products on a decentralized network upon recognizing the potential and value of Web3. However, he acknowledged that Web3 could not compete with the simplicity of use that traditional finance offered to millions of daily users worldwide. After discovering Radix in 2018, Liebenberg decided to combine his love for programming and knowledge of the financial world to rebuild traditional financial products on Radix. In 2022, he quit his job to become a full-time Radix developer. In March 2023, Liebenberg participated in a Twitter space about AlphaDEX (https://twitter.com/i/spaces/1DXxyvypBBvKM) . Features AlphaDEX allows users to trade on an order book exchange in a non-custodial way and offers advanced trading strategies not possible or easy to implement on automated market maker (AMM) exchanges. The majority of fees earned on the exchange, up to 90%, is returned to users who contribute to its success. The platform is designed for other builders to build decentralized applications (dApps) on and share in the liquidity and efficiency generated by their combined users. AlphaDEX consists of three layers: - Direct on-ledger interaction with Scrypto components - A REST API and WebSocket server - A JavaScript SDK AlphaDEX ensures low fees to maintain efficiency, with a maximum trading fee of 0.5% that automatically decreases as trading volume on a pair increases. Liquidity providers share in the majority of the fee earned when their order is matched. https://twitter.com/cryptocoinfi/status/1678146989303771137?s=20 (https://twitter.com/cryptocoinfi/status/1678146989303771137?s=20) Founders Fred Liebenberg (https://www.notion.so/Fred-Liebenberg-ca5e5863813b41afbfec3fd44b4495af?pvs=21) , a South Africa-based finance professional with 25 years of experience in traditional finance, founded AlphaDEX. 
Liebenberg has a strong background in designing derivative products, managing institutional investments, and analyzing portfolio manager strategies. Towards the Storm, Episode 9: AlphaDEX featuring Fred Liebenberg. (https://youtu.be/Ov5q1C_8hNw?si=OeTN-If1cHLfzJp5) Inspired by the potential of Web3 and programmable money through smart contracts, Liebenberg aims to bring the simplicity and security of traditional finance products to Web3 and empower the world to leverage the real opportunity that decentralized finance (DeFi) has to offer. ## Allnodes URL: https://radix.wiki/ecosystem/allnodes Updated: 2026-02-06 Summary: Allnodes is a non-custodial platform that offers hosting services for various types of nodes, including Masternodes, Validator Nodes, Super Nodes, Sentry Nodes, Allnodes is a non-custodial platform that offers hosting services for various types of nodes, including Masternodes, Validator Nodes, Super Nodes, Sentry Nodes, Full Nodes, and also supports staking in over 70 protocols. They provide reliable infrastructure for users to participate in blockchain networks and earn rewards. Allnodes is a trusted partner for Radix, providing services related to hosting masternodes, validator nodes, super nodes, sentry nodes, full nodes, and staking. Allnodes offers a non-custodial platform that allows users to easily host nodes, stake coins, and monitor blockchain addresses. They provide a secure and automated process with competitive prices and even offer a free trial. Allnodes supports Radix (XRD) staking, and they provide instructions on how to stake Radix on their platform. Overview Allnodes is a trusted partner for Radix, providing services related to hosting masternodes, validator nodes, super nodes, sentry nodes, full nodes, and staking. Allnodes offers a non-custodial platform that allows users to easily host nodes, stake coins, and monitor blockchain addresses. 
They provide a secure and automated process with competitive prices and even offer a free trial. Allnodes supports Radix (XRD) staking, and they provide instructions on how to stake Radix on their platform. Mission Allnodes' mission as a Radix Blockchain partner is to empower users to easily and securely participate in the Radix ecosystem. They aim to simplify the process of running nodes and staking for Radix (XRD) by providing a user-friendly platform with robust security measures. Allnodes takes a non-custodial approach to hosting and staking, meaning users retain full control over their assets while leveraging their platform's infrastructure to support the Radix network. They achieve this by automating the setup and maintenance of nodes and providing a straightforward interface for managing them. Overall, Allnodes' main goal is to help users join and contribute to the Radix network without the technical overhead. They strive to provide a reliable and trustworthy service to support the growth of the Radix ecosystem and make it accessible to a wider audience. Product and Services Allnodes is a platform that offers a range of products and services related to hosting and running masternodes and staking nodes. Here are some of the key products and services offered by Allnodes: Masternode Hosting Allnodes offers masternode hosting on a variety of networks. This includes hosting services for popular masternode coins like Dash, PIVX, and Zcoin. Users can set up and run a masternode on the Allnodes platform, which provides a reliable and secure hosting environment. Shared Masternode Hosting In addition to dedicated masternode hosting, Allnodes also offers shared masternode hosting for some coins. This allows users to pool their resources and run a masternode together, which can be a more affordable option with lower barriers to entry. Staking Node Hosting Allnodes also offers hosting services for staking nodes on a variety of networks. 
This includes popular staking coins like Cosmos, Tezos, and VeChain. Hosting your staking node on Allnodes provides a secure and reliable environment for staking activities. Dedicated Servers For users who require more customization, Allnodes offers dedicated server hosting. This allows users to set up and configure their own server environment, supporting a wider range of use cases beyond masternode and staking hosting. Node Monitoring Allnodes provides comprehensive node monitoring tools, which help users keep an eye on the health and performance of their masternodes and staking nodes. This includes real-time monitoring, alerts, and other diagnostics and troubleshooting tools. Benefits Joining Allnodes offers several benefits, including: Node Hosting Allnodes offers a secure non-custodial hosting service for various types of nodes, including Masternodes, Validator Nodes, Super Nodes, Sentry Nodes, and Full Nodes. This service enables users to participate in blockchain networks and earn rewards without worrying about server maintenance or setup. Easy to Use The Allnodes platform is user-friendly, and users can easily launch a node within a few minutes by following simple instructions. Automatic Updates Allnodes ensures that users' nodes are up-to-date with the required software updates, ensuring that the nodes are secure and free from vulnerabilities. Competitive Fees Allnodes charges competitive fees for its hosting services, and users can choose to pay in over 30 cryptocurrencies, including BTC, ETH, LTC, and more. High Uptime Allnodes provides 99.99% uptime for its node hosting services, ensuring that users' nodes are accessible and earning rewards when connected to the blockchain network. Excellent Customer Support Allnodes provides excellent customer support to its users, with 24/7 availability through their support desk. They also have live chat support where users can ask any questions they may have about their hosting services. 
Wide Range of Blockchains Allnodes supports staking and hosting services in over 70 protocols, including notable protocols such as Ethereum, Cardano, Cosmos, Avalanche, and more. In summary, joining Allnodes offers users a secure and reliable platform to participate in various blockchain networks and earn rewards without worrying about server maintenance or setup. The platform is easy to use, and users can access competitive fees, high uptime, and excellent customer support. Sources: Allnodes Review - cult of money (https://www.cultofmoney.com/allnodes-review/) About Allnodes - Allnodes (https://www.allnodes.com/about) Security and Trust Allnodes is considered a trusted partner in the Radix ecosystem and is known for providing secure hosting and staking services. They prioritize the security and reliability of their platform to ensure the safety of user funds and data. Allnodes utilizes advanced security measures to protect user assets and information. They implement encryption protocols, secure connections, and follow best practices for server and infrastructure security. Additionally, they employ monitoring systems and automated backups to ensure data integrity. As a non-custodial platform, Allnodes does not require users to transfer their funds to their control. Instead, users retain full control over their Radix coins, while Allnodes facilitates the hosting and staking process. ## Academia Scrypto URL: https://radix.wiki/ecosystem/academia-scrypto Updated: 2026-02-06 Summary: Academia Scrypto is an educational platform focused on teaching the Scrypto programming language, which is aimed at asset-oriented and open-source development. Academia Scrypto is an educational platform focused on teaching the Scrypto programming language, which is aimed at asset-oriented and open-source development. Developed by RadixDLT, Scrypto is based on Rust and is designed to make it easier for programmers to create smart contracts on the Radix network. 
Academia Scrypto offers free courses and resources in multiple languages, including Spanish and Italian, to enable learners to develop decentralized applications (dApps). Overview What is Scrypto? Scrypto is an open-source, asset-oriented programming language developed by RadixDLT. It borrows heavily from the Rust programming language and adds new features, specialized syntax, and data types. Scrypto aims to offer developers an easier and more secure way to create smart contracts within the Radix network. Why Learn Scrypto? The Academia Scrypto platform outlines several reasons for learning Scrypto: - Ease of Learning: Scrypto has a lighter learning curve compared to other domain-specific languages for smart contracts. - Security: The language is designed to be predictable and to minimize costly errors, thanks to certain rules imposed by the Finite State Machine (FSM) platform on which it operates. - Reusability: Scrypto allows for the use of existing code, which is both secure and debugged. - Composability: In Scrypto, smart contracts—referred to as 'Components'—can interact with each other like building blocks. Additional motivations include growing job opportunities in the field, the scalable architecture of RadixDLT, and the possibility to innovate in the expanding world of decentralized finance (DeFi). Academia Scrypto Version 0.8 Vision The academy aims to train hundreds of people in the development of smart contracts in the coming months. It seeks to be inclusive, requiring no prior experience in programming or blockchain technology from its students. New content is added weekly, enabling a progressive learning path. Enrollment There is no formal enrollment process. Learners can follow the content published on the academy's website and YouTube channel. A Scrypto developer community also exists on Twitter for Spanish-speaking enthusiasts. 
Donations Supporters can assist the project in multiple ways, including delegating XRD tokens, creating educational content, or contributing to the academy's repository. ## Abandoned Arena URL: https://radix.wiki/ecosystem/abandoned-arena Updated: 2026-02-06 Summary: Abandoned Arena is an upcoming competitive game that will integrate with non-fungible tokens (NFTs) on the Radix DLT. Players will be able to use Abandoned S Abandoned Arena is an upcoming competitive game that will integrate with non-fungible tokens (NFTs) on the Radix DLT. Players will be able to use Abandoned Scorpions NFTs to compete with others, with other NFTs being supported in the future. Overview Abandoned Arena is built on the Radix network, leveraging its decentralized infrastructure and NFT capabilities. With Radix as the underlying technology, Abandoned Arena benefits from the scalability, security, and consensus mechanisms provided by the Radix network. In this game, players can use Abandoned Scorpions NFTs to compete with others. Abandoned Scorpions are a specific type of NFT associated with the game. The game allows players to bet a certain amount of XRD tokens (the native token on the Radix network) while playing, with a minimum bet requirement. Source: Abandoned Arena (https://abandonedarena.com/) Mission Abandoned Arena's mission is to offer a competitive gaming platform that utilizes non-fungible tokens (NFTs) and the Radix DLT (Distributed Ledger Technology) network, creating a new dimension in the world of online gaming. Their vision is to use blockchain gaming to reshape the gaming industry and open a new dimension in esports. Abandoned Arena aims to provide players with an immersive and engaging gaming experience that is transparent, secure, and efficient. The platform is built on the principles of decentralization, offering players the opportunity to compete in a fair and transparent manner. 
Products and Services Abandoned Arena is an upcoming competitive gaming platform that utilizes non-fungible tokens (NFTs) and the Radix DLT (Distributed Ledger Technology) network. Here are some key aspects of the products and services associated with Abandoned Arena: Competitive Gameplay Abandoned Arena offers competitive gameplay where players can engage in battles using NFTs. Players will be able to compete against each other and demonstrate their skills to win matches. Non-Fungible Tokens (NFTs) The game integrates NFTs, which are unique and indivisible digital assets. Abandoned Arena will feature specific NFTs called "Abandoned Scorpions" that players can collect, own, and use within the game. XRD Cryptocurrency Abandoned Arena employs the Radix cryptocurrency known as $XRD. Players will use $XRD to place bets and participate in the game. Rewards and Achievements The game will offer a rewards system where players can earn and win various rewards such as collectible NFTs, $XRD, special in-game items, and unlockable features. Radix DLT Network Abandoned Arena is built on the Radix DLT network, a decentralized and scalable platform for developing and deploying dApps. The Radix DLT network provides secure and efficient transactions, making it suitable for gaming applications. Rewards As per the official website of Abandoned Arena, players will be able to earn and win various rewards while playing the game. These rewards will include: Collectible NFTs Players can win and collect non-fungible token (NFT) game items that will showcase their achievements and progression in the game. $XRD Players will be able to earn $XRD by competing in the game. Special game items Players can win special weapons, armor, and other in-game items by competing and winning matches. Unique game features The game will offer unique features that players can unlock by playing the game and achieving a higher rank. 
Token Economics The Abandoned Arena game integrates with non-fungible tokens (NFTs) on the Radix DLT (Distributed Ledger Technology) platform. The specific token used in the game is XRD (the Radix cryptocurrency). Players will need to use XRD to place bets and participate in the game. Additionally, players will be able to obtain and use Abandoned Scorpions NFTs within the game. These NFTs may represent unique characters, items, or other assets. When playing the game, you bet a certain amount of $XRD, with a minimum bet. Battles on the network will have a fee (10%, subject to change or adjustment with price fluctuations). The winner receives the sum of both bets minus the fee; e.g., if each player bets 100 $XRD, the 200 $XRD pot pays out 200 * 0.9 = 180 $XRD to the winner. The fee goes entirely to $ARENA holders (e.g., holding 10% of the supply earns 2 $XRD from the aforementioned battle). Networks The Abandoned Arena game is built on the Radix DLT (Distributed Ledger Technology) platform, which is a decentralized network for building, deploying, and scaling decentralized applications (dApps). Radix DLT is designed to be a highly scalable and efficient protocol that can handle a large number of transactions per second (TPS) with no competition for resources, which makes it suitable for gaming and other high-throughput applications. It also provides decentralized trust and security properties, which are crucial for gaming platforms. ## 3Syde URL: https://radix.wiki/ecosystem/3syde Updated: 2026-02-06 Summary: 3Syde is a decentralized platform designed to enhance the user experience and navigation within the Radix ecosystem. It provides a suite of tools and features aimed at simplifying trading, social interaction, and asset management. 3Syde is a decentralized platform designed to enhance the user experience and navigation (https://3syde.gitbook.io/3syde) within the Radix ecosystem. It provides a suite of tools and features aimed at simplifying trading, social interaction, and asset management for users interacting with the Radix network. 
Platform 3Syde's platform offers several key features (https://3syde.gitbook.io/3syde/overview/what-we-do) and services to enhance the user experience within the Radix ecosystem: Virtual Wallets Users can create and manage virtual wallets directly through the 3Syde platform, eliminating the need for separate wallet applications. These virtual wallets streamline crypto dealings and transactions across devices. Trade Bots 3Syde provides advanced trade bots to automate trading strategies and optimize outcomes: - Buy/Sell Bot: This intelligent bot automates the buying and selling of tokens based on predefined criteria set by the user. It also includes a liquidity pool feature for purchasing newly launched tokens. - Binguard Bot (Planned): 3Syde is developing a specialized bot called the Binguard Bot, aimed at providing advanced trading capabilities (details to be announced). Social Trading 3Syde facilitates social trading and community engagement through the following features (https://3syde.gitbook.io/3syde) : - Follow and Copy Trading: Users can mirror the trading strategies of successful traders on the platform, allowing beginners to learn from experienced traders and experienced traders to diversify their approaches. - Leaderboard: A real-time leaderboard ranks traders based on their performance, consistency, and profitability, helping users identify top traders easily[2]. - Community-Driven Portfolios: Users can participate in portfolios curated based on the collective insights and voting of the community, potentially increasing returns and mitigating risks through a diversified approach. Security 3Syde places a strong emphasis on security and privacy when it comes to managing users' cryptographic keys and digital assets. 
The platform employs several security measures (https://3syde.gitbook.io/3syde/overview/security) to ensure the safety of user funds: Mnemonic Phrase Generation Upon registration, 3Syde generates a mnemonic phrase (a sequence of words) for each user in a deterministic fashion. This mnemonic phrase serves as the source from which the user's private key is derived. Private Key Derivation Using the mnemonic phrase, 3Syde derives the user's private key, which is then used to generate a master key. Master Key Encryption The master key, central to the user's cryptographic activities, is encrypted using Advanced Encryption Standard (AES) technology. The master key is encrypted directly on the user's device using their password as a seed, providing robust protection against unauthorized access. Ed25519 Algorithm for Key Derivation From the encrypted master key, 3Syde employs the Ed25519 algorithm, known for its strong security and efficiency in generating digital signatures, to derive subsequent keys used for creating and managing the user's wallets. Ed25519 utilizes elliptic curve cryptography, optimizing the process of generating secure and fast signatures for transactions. Client-Side Cryptographic Operations Importantly, all cryptographic operations, from mnemonic phrase generation to key derivation, occur client-side on the user's device. No sensitive data, such as the mnemonic, private key, or master key, is ever transmitted or stored externally, ensuring maximum security and privacy of the user's digital assets. Roadmap 3Syde has outlined a roadmap for the development and release of its platform, with the initial focus being on beta testing phases: Version 0.5 (https://3syde.gitbook.io/3syde/roadmap/0.5-beta) (Stokenet Beta Testing) In this initial beta testing phase, a limited number of 10 beta testers will have access to the first usable version of the 3Syde platform. 
Key features (https://3syde.gitbook.io/3syde/roadmap) and goals for this version include: - Creating and testing virtual wallets - Testing the Buy/Sell Bot and a Trade Bot for real-time ecosystem trading - Testing real-time notification systems (e.g., notifications for bot activities) - Identifying and addressing bugs and inconsistencies in wallet creation, management, and recovery - Testing social trading features (public wallets, leaderboard, copy trading) - Testing the community ranking system for users to follow and copy Version 0.6 (https://3syde.gitbook.io/3syde/roadmap/0.6-beta) (Stokenet Beta Testing) Building upon the previous version, an additional 15 beta testers will be invited to join the testing phase for Version 0.6. This version will focus on bug fixes and improvements from Version 0.5, as well as introducing new features (https://3syde.gitbook.io/3syde/roadmap/0.6-beta) such as: - Community-driven portfolios - Group creation functionality - Rewards system (recognizing active community users with platform rewards and prominence) By conducting these beta testing phases, 3Syde aims to thoroughly test and refine its platform, identify and address potential issues, and gather user feedback before a wider public release. Tokenomics 3Syde has introduced its native $3SYDE token, with a total supply of 1 billion tokens. 
The token allocation and distribution (https://3syde.gitbook.io/3syde/governance/tokenomics) are as follows: - Pre-sale: 41% of the total supply (410 million tokens) - Token pool: 35% of the total supply (350 million tokens) - Team and Developers: 6% of the total supply (60 million tokens), with 60% of this reserve having a vesting period of 3 years - EDGE CLUB Holder Airdrop: 3% (30 million tokens) - Surprise airdrop: 1% (details to be announced on Discord) - Reward system (for bot users and community recognition): 5% of the total supply (50 million tokens) - year 1 - Token Staking Rewards: 5% of the total supply (50 million tokens) - year 1 - Marketing/Promotions: 4% of the total supply (40 million tokens) Liquidity To ensure community trust and transparency, 100% of the funds raised during the pre-sale will be added to liquidity (https://3syde.gitbook.io/3syde/governance/tokenomics/liquidity) , with $EDG Club holding all pre-sale investors' capital. Transparency 3Syde has implemented various measures (https://3syde.gitbook.io/3syde/governance/tokenomics/transparency) to promote transparency and utility for the $3SYDE token: - Token holders will have access to premium features of the 3Syde platform - Active users will receive periodic airdrops of the token as an incentive - A buyback system will use 80% of the fees collected from dApp usage to buy back tokens from the market - Token staking will be launched immediately, with 5% of the supply locked in, providing an incentive for pre-sale investors to hold their tokens[3] Pre-sale 3Syde has outlined details for the pre-sale (https://3syde.gitbook.io/3syde/governance/pre-sale) of the $3SYDE token, providing various options for potential investors to secure their spot in the initial investment round: - Pre-sale window: 24 hours - Date: To be announced (TBA) - Pre-sale spots: 300 - Minimum/Maximum bid range: 700-10,000 XRD per spot (maximum of 40,000 XRD for 4 spots) - Pre-sale price: 0.001372 XRD per $3SYDE - Launch 
price: 0.001607 XRD per $3SYDE - Minimum funds required for launch: 562,500 XRD (if not reached, pre-sale will extend for another 24 hours) Participation Options - Raffle via Staking - Spots Available: 5 - Requirements: Stake a minimum of 50,000 XRD on the EDGE CLUB validator (each 50,000 XRD grants one raffle entry) - Snapshot and Raffle Dates: June 4 (details TBA) - Raffle via $EDG Token - Spots Available: 46 (subject to increase) - Requirements: Send 50 million $EDG per entry to a designated account - Raffle Date: June 6 at 2 PM EST - Special Spot (Entity TBA) - Spots Available: 1 - Details to be announced - Special Spot (Entity TBA) - Spots Available: 1 - Details to be announced Governance While specific details about 3Syde's governance model are not provided in the available information, some insights can be gathered from the tokenomics and pre-sale sections: Decision Making Process - The allocation of tokens for the team and developers (6% of the total supply) suggests that this group plays a significant role in the decision-making process and governance of the project, at least initially. - The vesting period of 3 years for 60% of the team and developer tokens indicates a long-term commitment and incentive alignment with the success of the project. Community Involvement - The inclusion of community-driven portfolios as a planned feature highlights the intention to involve the community in decision-making processes related to portfolio curation and trading strategies. - The rewards system, which recognizes active community users and provides them with platform rewards and prominence, incentivizes community participation and engagement. - The surprise airdrop (1% of the total supply) and the allocation for token airdrops to active users suggest a mechanism to incentivize and reward community members. 
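The $3SYDE allocation figures listed under Tokenomics above can be tallied with a short script. This is only an illustrative arithmetic cross-check (category names are shortened here): it confirms that the stated percentages cover the full 1 billion token supply and converts each percentage into a token count.

```python
# Cross-check the $3SYDE allocation: percentages should sum to 100,
# and each tranche's token count follows from the 1 billion total supply.
TOTAL_SUPPLY = 1_000_000_000

allocations = {                  # percent of total supply, as listed
    "pre_sale": 41,
    "token_pool": 35,
    "team_and_developers": 6,
    "edge_club_airdrop": 3,
    "surprise_airdrop": 1,
    "reward_system_year_1": 5,
    "staking_rewards_year_1": 5,
    "marketing": 4,
}

# The tranches cover exactly the whole supply.
assert sum(allocations.values()) == 100

# Token counts implied by the percentages.
tokens = {name: TOTAL_SUPPLY * pct // 100 for name, pct in allocations.items()}
print(tokens["pre_sale"])    # 410000000
print(tokens["token_pool"])  # 350000000
```

Note that the two 5% tranches (bot/community rewards and staking rewards) are each scoped to year 1 in the source material, so this is a snapshot of the initial distribution rather than a long-term emission schedule.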
## $EARLY URL: https://radix.wiki/ecosystem/early Updated: 2026-02-06 Summary: $EARLY is a cryptocurrency token operating on the Radix distributed ledger technology (DLT). It is primarily designed for use within the Radix ecosystem, with a focus on providing utility for early adopters and enthusiasts. $EARLY is a cryptocurrency token operating on the Radix distributed ledger technology (DLT) (https://stillearly.today/) . It is primarily designed for use within the Radix ecosystem, with a focus on providing utility for early adopters and enthusiasts. The token's name and branding revolve around the concept of being "early" to a potentially transformative technology, which is a commonly shared feeling in the Radix community (https://stillearly.today/) . Overview As of October 5, 2024, $EARLY has a market capitalization of approximately US$1.1 million and a 7-day trading volume of US$53.6k (https://stillearly.today/) . The token's price is measured in XRD (the native currency of the Radix network), with a value of 0.0681 XRD per $EARLY. The token has a 100% circulating supply (https://stillearly.today/) , indicating that all minted tokens are available in the market. $EARLY is associated with a suite of technologies and services collectively known as earlyTech (https://stillearly.today/) . These include trading tools like earlyBot, information services such as earlyAlerts and earlyGPT, and community engagement features like earlyIntern. A significant component of the $EARLY ecosystem is earlyStudios (https://stillearly.today/studios/about) , a platform for distributing unique art and launching experimental projects for $EARLY token holders. The token's community emphasizes the idea of being "early" to the Radix technology, with the tagline "You're still $EARLY ...but not for long" (https://stillearly.today/) frequently used in their messaging. This philosophy underpins much of the token's branding and community culture. 
It's important to note that while $EARLY operates within the Radix ecosystem, it is a separate entity from Radix (XRD) itself. As with all cryptocurrencies, potential users and investors should be aware of the volatility and risks associated with such digital assets. History The precise launch date and detailed history of $EARLY are not explicitly stated in the provided documentation. However, we can infer some information about its development and context within the Radix ecosystem. $EARLY is closely associated with the Radix distributed ledger technology (DLT), which forms the foundation of its operation. The token appears to have been created to serve the Radix community (https://stillearly.today/) , particularly focusing on early adopters and enthusiasts of the Radix technology. The development of $EARLY seems to have been accompanied by the creation of several associated technologies and services, collectively known as earlyTech (https://stillearly.today/) . These include trading tools, information services, and community engagement features, suggesting a planned ecosystem around the token. A significant milestone in the history of $EARLY was the launch of earlyStudios (https://stillearly.today/studios/about) , a platform for distributing unique art and launching experimental projects for $EARLY token holders. The exact date of this launch is not provided in the available documentation. The token's history is closely tied to its community, which has developed a unique culture around the concept of being "early" to a potentially transformative technology. This is reflected in the token's name and the tagline "You're still $EARLY ...but not for long" (https://stillearly.today/) , which has become a central theme in the token's messaging and community engagement. As of October 5, 2024, $EARLY has established itself within the Radix ecosystem, with a market capitalization of approximately US$1.1 million (https://stillearly.today/) . 
However, it's important to note that this represents a snapshot in time, and the token's value and market presence may have fluctuated significantly since its inception. The history of $EARLY also includes the development of the Studio Pass system (https://stillearly.today/studios/studio-pass) , which was implemented to manage access to earlyStudios drops. The initial distribution of these passes involved an airdrop to holders of 50k+ $EARLY tokens, marking a significant event in the token's history. It's worth noting that the full history of $EARLY, including its inception, major developmental milestones, and any significant events or changes in its ecosystem, would require additional sources beyond the provided documentation for a comprehensive account. Technology The technology behind $EARLY is closely tied to the Radix distributed ledger technology (DLT) and a suite of associated tools and services known as earlyTech. This section outlines the key technological components of the $EARLY ecosystem. earlyTech earlyTech refers to a suite of technological tools and services associated with $EARLY. These include: - earlyBot: This is described as enabling "#1 Fastest trading on Radix" (https://stillearly.today/) . earlyBot allows users to "Trade anywhere, anytime" (https://stillearly.today/) . As of the documentation date, 6.66m XRD had already been traded (https://stillearly.today/) using earlyBot. - earlyAlerts: This tool is designed to help users "Snipe new pools, Spot the rugs, Avoid DEX lag" (https://stillearly.today/) . It appears to be an information service to assist traders in making timely decisions. - earlyIntern: This feature allows users to "Tip other users" (https://stillearly.today/) and supports "20+ supported tokens" (https://stillearly.today/) . It also enables users to "Earn for content" (https://stillearly.today/) , suggesting a content creation and reward system. - earlyGPT: Described as a "proprietary 
algorithm LLM chat experience" (https://stillearly.today/) , earlyGPT appears to be an AI-powered chat interface, though the exact capabilities are not detailed. - Studios: This is a "Creative launch pad and drop house" (https://stillearly.today/) within the $EARLY ecosystem. It includes features like "Shitcoin Russian Roulette" and "Shitcoin Transformer" (https://stillearly.today/) , though the specifics of these features are not explained in the documentation. Studio Pass Technology The Studio Pass (https://stillearly.today/studios/studio-pass) is a key technological component of the $EARLY ecosystem. It's an NFT (Non-Fungible Token) that gives holders permission to mint particular drops from earlyStudios. The Studio Pass implements dynamic metadata, which is updated after each snapshot to reflect whether the holder met the threshold for a particular drop. Features and Utility $EARLY offers a range of features and utilities within the Radix ecosystem, primarily centered around trading, information services, community engagement, and creative content. These features are largely provided through the earlyTech suite of tools and services. Trading - earlyBot: - Described as the "#1 Fastest trading on Radix" (https://stillearly.today/) - Allows users to "Trade anywhere, anytime" (https://stillearly.today/) - As of the documentation date, 6.66m XRD had been traded using earlyBot (https://stillearly.today/) - Fast Transactions: - The $EARLY ecosystem emphasizes speed, with claims of being the "#1 Fastest trading on Radix" (https://stillearly.today/) Information and Alerts - earlyAlerts: - Designed to help users "Snipe new pools, Spot the rugs, Avoid DEX lag" (https://stillearly.today/) - Provides timely information to assist traders in decision-making - earlyGPT: - Described as a "proprietary 
algorithm LLM chat experience" (https://stillearly.today/) - Allows users to "Check if you're still early" (https://stillearly.today/) using an AI-powered chat interface Community Engagement - earlyIntern: - Enables users to "Tip other users" (https://stillearly.today/) - Supports "20+ supported tokens" (https://stillearly.today/) for tipping - Allows users to "Earn for content" (https://stillearly.today/) , suggesting a content creation and reward system - Community Roles: - The $EARLY community has various roles and titles, including "Daddy Shill", "Karma Collector", "Corporate Priest" (https://stillearly.today/) , among others - Community Activities: - Free Taco Tuesdays (https://stillearly.today/) are mentioned as a community event earlyStudios - Creative Platform: - Described as a "Creative launch pad and drop house" (https://stillearly.today/studios/about) - Facilitates the distribution of unique art from curated artists (https://stillearly.today/studios/about) and the launch of experimental projects - Studio Pass: - An NFT that gives holders permission to mint particular drops (https://stillearly.today/studios/studio-pass) from earlyStudios - Features dynamic metadata that updates after each snapshot to reflect eligibility for drops - Artist Collaboration: - earlyStudios provides a platform for artists to feature their work (https://stillearly.today/studios/apply) and create limited edition collections - Collections: - Various types of collections are mentioned, including "Art", "Collectible", and "Experimental" (https://stillearly.today/studios/collections) Experimental Features - Shitcoin Russian Roulette: - Mentioned as a feature, but details are not provided (https://stillearly.today/) - Shitcoin Transformer: - Listed as a feature with a "6.9" rating (https://stillearly.today/) , but specifics are not explained Token Utility - Access to Drops: - Holding a certain amount of $EARLY tokens makes users eligible for various drops and features 
(https://stillearly.today/studios/about) within the ecosystem - Staking: - While mentioned in the navigation menu (https://stillearly.today/studios/collections) , details about staking mechanisms or rewards are not provided in the available documentation It's important to note that while these features and utilities are described in the $EARLY documentation, the specific mechanisms of how they function or their effectiveness are not detailed. As with any cryptocurrency project, potential users should conduct their own research and consider the associated risks. Token Economics The token economics of $EARLY are characterized by its market metrics, supply dynamics, and utility within the Radix ecosystem. However, it's important to note that the available documentation provides limited information on the comprehensive tokenomics of $EARLY. Market Metrics As of the documentation date, the following market metrics were reported for $EARLY: - Market Capitalization: - US$ 1.1m (https://stillearly.today/) - This figure represents the total market value of the circulating supply of $EARLY tokens. - Trading Volume: - 7d Volume US$ 53.6k (https://stillearly.today/) - This indicates the total value of $EARLY traded over a 7-day period. - Price: - 0.0681 XRD (https://stillearly.today/) - The price of $EARLY is denominated in XRD, the native token of the Radix network. Supply Dynamics The supply characteristics of $EARLY are partially described in the documentation: - Circulating Supply: - 100% (https://stillearly.today/) - This suggests that all minted $EARLY tokens are in circulation and available in the market. - Total Supply: - Not explicitly stated in the provided documentation. - Max Supply: - Not mentioned in the available information. 
Token Utility and Value Drivers Several factors contribute to the utility and potential value of $EARLY tokens: - Access to earlyStudios Drops: - Holding a certain amount of $EARLY tokens makes users eligible for various drops (https://stillearly.today/studios/about) within the earlyStudios ecosystem. - Studio Pass Eligibility: - The amount of $EARLY held determines eligibility for minting drops (https://stillearly.today/studios/studio-pass) through the Studio Pass system. - Community Engagement: - $EARLY can be used for tipping other users and earning rewards for content creation (https://stillearly.today/) through the earlyIntern system. - Trading Utility: - While not explicitly stated, the existence of earlyBot for fast trading (https://stillearly.today/) suggests that $EARLY may have utility in trading activities within the Radix ecosystem. Token Distribution The documentation does not provide specific information about the initial token distribution, token sale events, or allocation to different stakeholders (e.g., team, advisors, community fund). Staking and Rewards While staking is mentioned in the navigation menu (https://stillearly.today/studios/collections) , the documentation does not provide details about staking mechanisms, rewards, or how staking might affect the token supply and value. Trading Pairs and Liquidity The documentation mentions trading volumes but does not provide specific information about trading pairs or liquidity pools involving $EARLY. It's important to note that the token economics of cryptocurrencies can be complex and subject to change. The information provided here is based on the available documentation at the time of writing. Potential investors and users should conduct their own research and consider the volatility and risks associated with cryptocurrency markets. 
Community and Culture The $EARLY token has fostered a unique community and culture within the Radix ecosystem, characterized by a shared philosophy, engagement features, and a distinct set of cultural elements. "Still Early" Philosophy The core philosophy of the $EARLY community revolves around the concept of being early adopters of potentially transformative technology: - Tagline: - The community frequently uses the tagline "You're still $EARLY ...but not for long" (https://stillearly.today/) , emphasizing the perceived opportunity in the project. - Community Sentiment: - Being early is described as a "commonly shared feeling in the Radix community" (https://stillearly.today/) and is portrayed as both an opportunity and a source of potential doubt. - Future Outlook: - The community expresses strong belief in the potential of Radix, describing it as "the most promising opportunity to take DeFi mainstream" (https://stillearly.today/) . Community Engagement Features The $EARLY ecosystem includes several features designed to foster community engagement: - earlyIntern: - This system allows users to "Tip other users" (https://stillearly.today/) and "Earn for content" (https://stillearly.today/) , promoting active participation and content creation within the community. - earlyStudios: - Described as a "Creative launch pad and drop house" (https://stillearly.today/studios/about) , this platform encourages community engagement through art and experimental projects. - Studio Pass: - This NFT-based system (https://stillearly.today/studios/studio-pass) creates a sense of exclusivity and encourages continued engagement with the platform. 
Community Roles and Titles The $EARLY community has established a variety of unique roles and titles for its members, including: - "Daddy Shill" (https://stillearly.today/) - "Tom Newton Quant ॐ" (https://stillearly.today/) - "Robo Cock R&D" (https://stillearly.today/) - "Woopty Tik Tok Jock" (https://stillearly.today/) - "Karma Collector" (https://stillearly.today/) - "Dr Pickle" (https://stillearly.today/) - "Buzz Ambassador" (https://stillearly.today/) - "Soundie Lead Integrator" (https://stillearly.today/) - "Corporate Priest" (https://stillearly.today/) These titles appear to be a mix of humorous and functional roles within the community, although their specific responsibilities or how they are assigned is not detailed in the documentation. Community Activities The community engages in various activities, although details are limited in the provided documentation: - Free Taco Tuesdays: - Mentioned as a regular community event (https://stillearly.today/) , although specifics are not provided. - Content Creation: - The community encourages content creation, with the ability to "Earn for content" (https://stillearly.today/) through the earlyIntern system. - Artist Collaboration: - The community supports artists featuring their work (https://stillearly.today/studios/apply) through earlyStudios. Community Humor and Language The community appears to have developed its own style of humor and language: - Unconventional Spelling: - The use of terms like "earlyAdoptoors" (https://stillearly.today/) suggests a playful approach to language within the community. - Self-Deprecating Humor: - Features like "Shitcoin Russian Roulette" and "Shitcoin Transformer" (https://stillearly.today/) indicate a self-aware, humorous approach to cryptocurrency culture. 
It's important to note that while these community and cultural elements are mentioned in the $EARLY documentation, the depth of community engagement, the effectiveness of these features, and the overall health of the community are not quantified or independently verified in the available materials. ## bondefi URL: https://radix.wiki/ecosystem/bondefi Updated: 2026-02-06 Summary: BonDeFi is a decentralized crowdfunding platform and token launchpad built on Radix DLT, a distributed ledger technology designed for decentralized finance applications. BonDeFi is a decentralized crowdfunding platform and token launchpad built on  Radix DLT (https://www.radixdlt.com/) , a distributed ledger technology designed for decentralized finance applications. The platform positions itself as providing " radically different decentralized crowdfunding (https://bondefi.xyz/) " and aims to offer an alternative fundraising mechanism that bypasses traditional venture capital involvement. Overview The platform's core functionality centres on improving the token launch process through several integrated mechanisms. According to coverage from the  European Blockchain Convention (https://www.radixdlt.com/blog/european-blockchain-convention-event-highlights) , BonDeFi features a pre-sale phase with custom pricing using bonding curves, guaranteed pre-sale goals with full refunds if funding targets are not met, automatic decentralized exchange listing at the Token Generation Event with locked liquidity, and immediate token utility upon launch. Trading fees generated from locked liquidity are redistributed to participants through what the platform describes as a "real-yield" staking program. BonDeFi employs a  batched bonding curve model (https://medium.com/@bondefi/bonding-curves-anti-snipe-1cf47220d3bc)  as its primary pricing mechanism. Bonding curves are mathematical formulas that determine token prices based on supply, automatically adjusting prices as tokens are minted or sold. 
The platform's batched approach is designed to address common problems in token launches, including price volatility and exploitation by automated trading bots. By releasing tokens in batches rather than continuously, the system aims to provide guaranteed liquidity, reduce sudden price swings, and ensure fair access for all participants regardless of their technical capabilities or transaction speed. The platform provides project creators with a  dashboard interface (https://bondefi.xyz/)  for managing funding rounds, while offering investors tools to discover and participate in token pre-sales. BonDeFi describes its approach as creating "a transparent, efficient, and globally accessible funding ecosystem" that serves both builders seeking capital and investors looking for early-stage opportunities in the Radix ecosystem. History BonDeFi emerged as a project from the Scrypto Hackathon held during the  10th edition of the European Blockchain Convention (https://eblockchainconvention.com/hackathon/)  in Barcelona, Spain on 25–26 September 2024. The hackathon, sponsored by Radix DLT, brought together  over 150 developers (https://www.radixdlt.com/blog/european-blockchain-convention-event-highlights)  to build decentralized applications using Scrypto, Radix's Rust-based smart contract programming language. Many participants were writing Scrypto code for the first time during the event. BonDeFi was among  five teams that received awards (https://www.radixdlt.com/blog/european-blockchain-convention-event-highlights)  for their submissions at the conclusion of the hackathon. According to Radix DLT, four of the five winning teams, along with several other hackathon participants, opted to continue developing their projects with support from RDX Works, the primary contributor to the Radix protocol. 
The convention itself attracted over 6,000 attendees and featured participation from major financial institutions including Santander, ING Bank, Mastercard, Société Générale, and BNP Paribas. Following the hackathon, the BonDeFi team continued development of the platform. By March 2025, the project had published  technical documentation (https://medium.com/@bondefi/bonding-curves-anti-snipe-1cf47220d3bc)  detailing its batched bonding curve model and anti-sniping mechanisms, indicating ongoing work on the platform's core features beyond the initial hackathon prototype. Technology Bonding Curve Mechanism The platform employs bonding curves as its core pricing mechanism for token pre-sales. A bonding curve is a mathematical formula that establishes a  relationship between a token's price and its supply (https://medium.com/@bondefi/bonding-curves-anti-snipe-1cf47220d3bc) , automatically adjusting prices as tokens are minted or sold. BonDeFi's documentation describes bonding curves as "revolutionizing DeFi by creating a  dynamic pricing mechanism (https://bondefi.xyz/)  for tokens" where "as more tokens are minted, the price changes along a predetermined curve, incentivizing early adoption and investment." Several mathematical models exist for bonding curves.  Linear bonding curves (https://medium.com/@bondefi/bonding-curves-anti-snipe-1cf47220d3bc)  increase price at a constant rate as supply grows, following the formula P = m⋅S+b, where P represents price, m is the slope, S is the current supply, and b is the initial price.  Exponential bonding curves (https://medium.com/@bondefi/bonding-curves-anti-snipe-1cf47220d3bc)  produce more rapid price appreciation, following the formula P = a⋅e^(b⋅S), where a represents the initial price multiplier and b controls the exponential growth rate. 
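The linear and exponential formulas above can be sketched directly. The parameter values below are arbitrary illustrations, not BonDeFi's published settings:

```python
import math

def linear_price(supply: float, slope: float, initial_price: float) -> float:
    """Linear bonding curve: P = m*S + b."""
    return slope * supply + initial_price

def exponential_price(supply: float, a: float, b: float) -> float:
    """Exponential bonding curve: P = a * e^(b*S)."""
    return a * math.exp(b * supply)

# Price rises with supply under both models (example parameters only):
print(linear_price(0, slope=0.001, initial_price=0.10))      # initial price 0.1
print(linear_price(1_000, slope=0.001, initial_price=0.10))  # 1.1 after 1,000 tokens
print(exponential_price(1_000, a=0.10, b=0.002))             # ~0.739, i.e. 0.1 * e^2
```

Steeper slopes or growth rates make price appreciate faster as supply grows, which is the "incentivizing early adoption" effect the documentation describes.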
BonDeFi states that its platform allows builders to " incorporate different variables and conditions (https://bondefi.xyz/)  into the price model and create dynamic pricing based on factors like sale speed and demand." BonDeFi specifically implements what it calls a " batched bonding curve model (https://medium.com/@bondefi/bonding-curves-anti-snipe-1cf47220d3bc) ," which releases tokens in discrete batches rather than continuously. The platform claims this approach offers several advantages including  reduced volatility (https://medium.com/@bondefi/bonding-curves-anti-snipe-1cf47220d3bc)  through the batched release of tokens,  guaranteed liquidity (https://medium.com/@bondefi/bonding-curves-anti-snipe-1cf47220d3bc)  ensuring investors can always buy or sell, and fair access giving all participants equal opportunity to acquire tokens. Anti-Sniping Innovation A significant technical feature of BonDeFi's design is its approach to preventing token sniping. In the context of decentralized exchanges and token launches, sniping refers to the practice of using automated trading bots to purchase tokens immediately upon listing, often within milliseconds of liquidity being added. These bots monitor blockchain mempools for pending liquidity transactions and execute buy orders faster than human traders can respond, allowing bot operators to acquire tokens at the lowest possible prices before selling at a profit once regular traders enter the market. BonDeFi addresses this problem through its batched bonding curve architecture. According to the project's documentation, the batching mechanism  makes sniping "impossible" (https://medium.com/@bondefi/bonding-curves-anti-snipe-1cf47220d3bc)  by eliminating the first-mover advantage that sniper bots exploit. 
Rather than allowing continuous purchasing where transaction speed determines who obtains the best prices, the batched system processes orders in groups, theoretically ensuring that  all participants have an equal opportunity (https://medium.com/@bondefi/bonding-curves-anti-snipe-1cf47220d3bc)  to acquire tokens at fair prices regardless of their technical capabilities or access to automated trading infrastructure. The platform positions this anti-sniping feature as a core differentiator, framing it as an " anti-sniping innovation (https://medium.com/@bondefi/bonding-curves-anti-snipe-1cf47220d3bc) " that protects both project founders and retail investors from the extractive dynamics that can occur during token launches on other platforms. Features Pre-Sale Phase BonDeFi's pre-sale mechanism allows project creators to conduct token sales prior to public listing through its bonding curve pricing system. According to coverage from  Radix DLT (https://www.radixdlt.com/blog/european-blockchain-convention-event-highlights) , the platform "features a pre-sale phase with custom pricing using bonding curves," enabling projects to establish their own pricing parameters rather than setting a fixed token price. The platform's website states that builders can " incorporate different variables and conditions (https://bondefi.xyz/)  into the price model and create dynamic pricing based on factors like sale speed and demand." A distinguishing aspect of BonDeFi's pre-sale design is its  guaranteed goal mechanism with refund protection (https://www.radixdlt.com/blog/european-blockchain-convention-event-highlights) . If a project fails to reach its stated funding target, investors receive full refunds and the Token Generation Event does not proceed. This structure aims to protect investors from contributing to projects that fail to attract sufficient community support while ensuring that projects only launch when they have achieved a baseline level of capitalisation. 
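The all-or-nothing refund rule resembles a simple escrow: funds are released to the project only if the goal is met, otherwise every contributor reclaims their full amount. A minimal sketch under that assumption (not BonDeFi's actual contract logic):

```python
class PresaleEscrow:
    """Holds contributions until finalization: reach the goal and the TGE
    proceeds; miss it and every contribution is refunded in full."""

    def __init__(self, goal: float):
        self.goal = goal
        self.contributions: dict[str, float] = {}

    def contribute(self, investor: str, amount: float) -> None:
        self.contributions[investor] = self.contributions.get(investor, 0.0) + amount

    def total(self) -> float:
        return sum(self.contributions.values())

    def finalize(self):
        if self.total() >= self.goal:
            return ("tge", self.total())    # funding target met, proceed to TGE
        refunds = dict(self.contributions)  # goal missed: full refunds
        self.contributions.clear()
        return ("refund", refunds)

escrow = PresaleEscrow(goal=10_000.0)
escrow.contribute("alice", 4_000.0)
escrow.contribute("bob", 3_000.0)
print(escrow.finalize())  # ('refund', {'alice': 4000.0, 'bob': 3000.0})
```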
For project creators, BonDeFi provides what it describes as " a single, intuitive dashboard (https://bondefi.xyz/)  to manage your funding rounds." The platform positions itself as enabling users to "start your global micro-funding round easier than ever" without requiring traditional venture capital involvement. For investors, the platform offers discovery tools to identify and evaluate pre-sale opportunities, with the stated goal of ensuring participants " never miss an investment opportunity (https://bondefi.xyz/) ." Token Generation Event The Token Generation Event represents the point at which a successfully funded project's tokens become publicly tradable. On BonDeFi, this process includes  automatic listing on a decentralized exchange (https://www.radixdlt.com/blog/european-blockchain-convention-event-highlights)  once the pre-sale funding goal has been met. This automated approach removes the need for project teams to manually coordinate exchange listings or negotiate with trading venues, theoretically reducing the time between successful fundraising and market availability. A key component of BonDeFi's TGE process is  locked liquidity (https://www.radixdlt.com/blog/european-blockchain-convention-event-highlights) . When tokens are listed on the decentralized exchange, a portion of the raised funds is locked as liquidity to facilitate trading. Locking liquidity is a practice in decentralized finance intended to prevent project creators from withdrawing pooled funds immediately after listing, a practice sometimes referred to as a "rug pull." By automating this lock at the protocol level, BonDeFi aims to provide structural assurances to investors that trading liquidity will remain available. The platform also emphasizes  immediate token utility (https://www.radixdlt.com/blog/european-blockchain-convention-event-highlights)  at the point of generation. 
Rather than tokens existing solely as speculative instruments following their creation, BonDeFi's framework is designed to ensure that tokens can serve functional purposes within their respective project ecosystems from the moment they become available. Real-Yield Staking BonDeFi incorporates a staking programme tied to its locked liquidity mechanism. According to the Radix DLT blog, " trading fees from locked liquidity are redistributed (https://www.radixdlt.com/blog/european-blockchain-convention-event-highlights)  through a 'real-yield' staking program." This means that when users trade tokens on the decentralized exchange where a project has been listed, the transaction fees generated from that trading activity are not retained by the platform or discarded but instead distributed to participants who stake their tokens. The "real-yield" terminology distinguishes this model from staking programmes that distribute newly minted tokens as rewards. In many decentralized finance protocols, staking rewards come from token inflation—new tokens are created and given to stakers, which can dilute the value of existing holdings over time. By contrast, a real-yield model distributes value derived from actual economic activity, in this case trading fees, rather than from expanding the token supply. This approach aims to create sustainable returns tied to genuine platform usage rather than inflationary token emissions. The specific mechanics of BonDeFi's staking programme, including distribution schedules, eligibility requirements, and the proportion of fees allocated to stakers, are not detailed in the available documentation. Use Cases BonDeFi positions itself as a general-purpose decentralized crowdfunding platform designed to serve " a wide range of use cases (https://bondefi.xyz/) ." 
The platform states that " builders of the future can leverage these customizable curves (https://bondefi.xyz/)  to design innovative tokenomics and funding strategies" across different project types and industries. The primary use case articulated by the platform is enabling token launches for blockchain-based projects. The  Radix DLT blog (https://www.radixdlt.com/blog/european-blockchain-convention-event-highlights)  describes BonDeFi as "a launchpad designed to improve the token launch process," indicating its core function as infrastructure for projects seeking to issue and distribute tokens to investors and community members. BonDeFi explicitly frames itself as an alternative to traditional venture capital funding. The platform's messaging encourages users to " start your global micro-funding round easier than ever - no VCs (https://bondefi.xyz/) ," suggesting a target audience of project creators who wish to raise capital directly from a distributed investor base rather than through institutional funding rounds. This positioning aligns with broader trends in decentralized finance toward disintermediation of traditional financial gatekeepers. The platform's website displays example project categories to illustrate potential use cases, including decentralized finance projects focused on lending and borrowing, blockchain platforms for supply chain management, tokens providing liquidity solutions for decentralized exchanges, projects building decentralized social media platforms with privacy features, and marketplaces for digital art and NFTs. However, these appear as  illustrative examples (https://bondefi.xyz/) demonstrating the breadth of projects the platform could theoretically support rather than documentation of actual projects that have completed fundraising through BonDeFi. As a platform built on the Radix network, BonDeFi's use cases are inherently tied to the Radix ecosystem. 
Projects launching through BonDeFi would issue tokens compatible with Radix's infrastructure, meaning the platform serves as a mechanism for expanding the range of assets and applications available within the Radix DeFi ecosystem specifically, rather than as a blockchain-agnostic fundraising tool. The available documentation does not provide information about specific projects that have successfully completed fundraising campaigns through the platform, the total value raised through BonDeFi to date, or detailed case studies of how particular project types have utilised the platform's features. Tokenomics The available documentation provides limited information about BonDeFi's platform-level tokenomics. While the platform offers extensive tools for projects to design their own token economics, details about BonDeFi's own economic model remain largely undisclosed in public sources. Project-Level Token Design BonDeFi enables projects launching on the platform to create customised tokenomics through its bonding curve infrastructure. The platform states that " builders of the future can leverage these customizable curves (https://bondefi.xyz/)  to design innovative tokenomics and funding strategies for a wide range of use cases." Projects can " incorporate different variables and conditions (https://bondefi.xyz/)  into the price model and create dynamic pricing based on factors like sale speed and demand," allowing for tailored economic designs rather than requiring adherence to a standardised token structure. The platform emphasises " guaranteed liquidity (https://bondefi.xyz/) " as a core feature, marketing itself as enabling projects to " emit your token on a bonding curve (https://bondefi.xyz/)  and collect equity for your project in a fair, safe and transparent way" while enjoying "guaranteed liquidity" and avoiding "the bad reputation problem of ICOs." 
This guarantee is implemented through the bonding curve mechanism, which maintains a mathematical relationship between token price and supply, theoretically ensuring that tokens can always be bought or sold at prices determined by the curve. Liquidity Lock Mechanisms At the Token Generation Event, BonDeFi implements  automatic liquidity locking (https://www.radixdlt.com/blog/european-blockchain-convention-event-highlights)  as part of the decentralized exchange listing process. Locking liquidity prevents project creators from immediately withdrawing pooled funds after a token launches, a safeguard against exit scams. The specific parameters of these locks, including duration, percentage of funds locked, and conditions for release, are not detailed in the available documentation. Fee Distribution The platform operates a fee redistribution model in which  trading fees from locked liquidity (https://www.radixdlt.com/blog/european-blockchain-convention-event-highlights)  are channelled through a "real-yield" staking programme. This indicates that the platform generates revenue from trading activity on tokens launched through its infrastructure, with at least a portion of these fees being distributed to staking participants rather than being retained entirely by the platform. The specific fee percentages, the split between platform retention and staker distribution, and the mechanics of the staking programme are not specified in publicly available sources. Undisclosed Information The documentation does not provide information regarding whether BonDeFi operates its own native platform token, the fee structure charged to projects for using the launchpad, specific percentages or durations for liquidity locks, token supply parameters if a platform token exists, allocation or vesting schedules, or governance mechanisms. 
This lack of publicly documented tokenomics may reflect the project's early stage of development following its emergence from the September 2024 hackathon, or may indicate that such details are shared through other channels not captured in web-accessible sources. Comparison to Other Launchpads Centralized versus Decentralized Launchpads Token launchpads generally fall into two categories: centralized platforms operated by cryptocurrency exchanges, and decentralized platforms built on blockchain infrastructure. Centralized launchpads such as Binance Launchpad are operated by regulated exchanges, offering project vetting, Know Your Customer procedures, and access to large existing user bases, but require projects to meet specific listing criteria and often involve significant fees or token allocations to the exchange. BonDeFi operates as a  decentralized crowdfunding platform (https://bondefi.xyz/) , positioning itself as enabling projects to " start your global micro-funding round easier than ever - no VCs (https://bondefi.xyz/) ," suggesting a model that bypasses both traditional venture capital and centralized exchange gatekeepers. The Radix Foundry Program, announced by Radix Publishing, identified launchpads as critical infrastructure for ecosystem growth, describing them as " a combination of dApp and organization (https://www.radixdlt.com/blog/introducing-radix-foundry-program)  that can help projects gain access to liquid capital, navigate the complex legal, compliance, and regulatory aspects of token launches, and support in marketing and distributing the token to thousands of new users." The programme cited  Seedify, DAOMaker, and Decubate (https://www.radixdlt.com/blog/introducing-radix-foundry-program)  as examples of launchpads that have achieved success on other networks, noting "there's an opportunity to be the prime launchpad in the Radix ecosystem."
Comparison with Bonding Curve Platforms BonDeFi shares architectural similarities with other bonding curve-based launchpads that have emerged on various blockchain networks.  Pump.fun (https://www.solflare.com/ecosystem/pump-fun-where-memes-meet-markets-on-solana/) , a Solana-based platform launched in January 2024, utilizes a bonding curve mechanism where 800 million of each token's 1 billion supply is placed on the curve, with prices increasing as buyers purchase tokens. When tokens reach a market capitalization threshold, they automatically migrate to a decentralized exchange for continued trading. Both BonDeFi and Pump.fun employ bonding curves to provide guaranteed liquidity and automatic price discovery. However, BonDeFi's documentation emphasises its  batched bonding curve model (https://medium.com/@bondefi/bonding-curves-anti-snipe-1cf47220d3bc) , which processes orders in discrete batches rather than continuously. The platform positions this batching mechanism as an  anti-sniping innovation (https://medium.com/@bondefi/bonding-curves-anti-snipe-1cf47220d3bc)  that prevents automated trading bots from exploiting first-mover advantages, a problem that continuous bonding curve platforms can be susceptible to. Another distinguishing feature is BonDeFi's  guaranteed pre-sale goals with refund protection (https://www.radixdlt.com/blog/european-blockchain-convention-event-highlights) . If a project fails to reach its stated funding target, the Token Generation Event does not proceed and investors receive refunds. This contrasts with platforms where tokens begin trading immediately upon creation regardless of whether meaningful community support has been established. Position within the Radix Ecosystem BonDeFi operates exclusively on the  Radix DLT network (https://bondefi.xyz/) , making it blockchain-specific rather than chain-agnostic. 
This positions it differently from multi-chain launchpads such as  PinkSale and DxSale (https://slashdot.org/software/p/PinkSale/alternatives) , which support token launches across numerous blockchain networks including Ethereum, Binance Smart Chain, and Solana. The  Radix Ecosystem Directory (https://www.radixdlt.com/ecosystem-directory)  lists multiple projects offering related functionality, including platforms for token creation with bonding curves and services providing token locking and vesting capabilities. Within this ecosystem, BonDeFi competes for the launchpad category that Radix has identified as strategically important for network growth. The platform's integration with Radix's technical infrastructure means projects launching through BonDeFi benefit from features specific to the Radix network, including  Scrypto's asset-oriented programming model (https://github.com/radixdlt/radixdlt-scrypto)  and the Radix Engine's approach to treating tokens as native platform resources rather than smart contract entries. However, this also means tokens launched through BonDeFi are limited to the Radix ecosystem unless separately bridged to other networks. Documented Differentiators Based on available documentation, BonDeFi emphasises several features as distinguishing it from other launchpad platforms. These include the batched bonding curve architecture designed to prevent sniping, guaranteed funding thresholds with automatic refunds, automatic decentralized exchange listing with  locked liquidity (https://www.radixdlt.com/blog/european-blockchain-convention-event-highlights)  at the Token Generation Event, and  real-yield staking (https://www.radixdlt.com/blog/european-blockchain-convention-event-highlights)  that distributes trading fees rather than inflationary token rewards. 
The platform also enables projects to design  customisable tokenomics (https://bondefi.xyz/)  with dynamic pricing variables, though the extent to which this differs from competitor offerings would require direct technical comparison not available in public documentation. ## RADIX.wiki URL: https://radix.wiki/ecosystem/radix-wiki Updated: 2026-02-06 RADIX.wiki (http://RADIX.wiki) is a permissionless knowledge base and community hub for the Radix ecosystem. The organization runs events such as the Radix Wiki Hackathon and the Dapp In A Day workshops, while the site hosts around 250 articles, ecosystem listings, a jobs board, and a talent pool section, all with a view to increasing the 'collision rate' between researchers, founders and contributors to Radix. Since 2022, the site has seen more than 10k visitors from over 100 countries. How to contribute There are several ways to contribute to the wiki: - Edit an article by clicking on the Edit button above. - Create a new page under the appropriate tag path (e.g., contents, ecosystem, community). - Ask questions or make suggestions in the comments section. - Enquire about sponsorships. Note: Contributing requires a Radix wallet with a sufficient XRD balance: creating pages requires holding 5,000–50,000 XRD depending on the section, editing requires holding 20,000 XRD, and commenting requires holding 10,000 XRD. Some sections (community, blog, RFPs) are author-only. Advantages - Utility: Knowledge and experience are wasted if they aren't shared, but a wiki can amplify and focus the collective wisdom, experience, and intelligence of the Radix community, benefitting the whole web3 movement. - Accessibility: Being spread across multiple locations makes information difficult and slow to find, but a comprehensive wiki can make it easier for community members to find the information they need, whether they are researching a topic or solving a problem.
- Speed: Information on a wiki can be updated within minutes of a new development, giving an immediate benefit to the community. - Accuracy: Multiple users can contribute and edit content on a wiki, leading to a more comprehensive, balanced, accurate, and up-to-date resource that also reflects the diversity of its contributors. - Neutrality: Hype can be off-putting for newcomers, but a neutral voice helps people to view information about Radix objectively. - Simplicity: A rich block-based editor makes it easy for anyone to contribute, no matter how small the change. - Version control: Semantic versioning with block-level diffs tracks every change and makes it easy to review or roll back edits. - Collaboration: Bringing knowledge and creators together in one place increases the collision rate of the community, leading to more projects and shared endeavors that will compound for Radix over the coming weeks and months. Disadvantages - Quality control: Although XRD balance requirements help filter contributions, it can still be difficult to ensure that all information is accurate and reliable. - Vandalism: Balance gating and author-only restrictions mitigate this, but wiki content remains editable by qualifying users. - Ownership: Since the information is contributed by multiple users, there can be issues with ownership and authorship. Author-only sections (community pages, blog, RFPs) help address this for personal content. Roadmap Just as Wikipedia is the municipal center of the internet, our vision is to be the same for Web3, assuming that Radix will be its substrate. Having moved to a fully native web3 application, the wiki now combines the power of Radix with a block-based editing system. Upcoming milestones include: - Community governance and treasury management. - Contributor rewards. - Integrations with other Radix projects. - On-ledger storage. 
Technical Features Key technical features of the application include: - Decentralized authentication using ROLA (Radix Off-Ledger Authentication) with Ed25519 signature verification. - Block-based content system with six block types: rich text, columns, infoboxes, recent pages, page lists, and asset price widgets. - Rich text editing via TipTap with tables, code blocks (syntax highlighted), YouTube/Twitter embeds, tab groups, and image uploads. - Semantic versioning with block-level change tracking (added, removed, modified, moved). - XRD balance-gated access control for creating, editing, and commenting. - Hierarchical tag-based content organization with per-path permissions. - Threaded comments. - Full revision history with structured diffs. Technical Description - Frontend: Next.js 15 (App Router) with React 19 for server-side rendering and client-side navigation. Tailwind CSS 4 with a custom dark-mode design system. - Database: PostgreSQL via Prisma ORM (hosted on Supabase). - Editor: TipTap 2.10 with custom block extensions. - State management: Zustand 5. - Image storage: Vercel Blob. - Authentication: Radix DApp Toolkit for wallet connection; ROLA challenge-response flow verified against the Radix Gateway API; JWT sessions stored in PostgreSQL. ## SoulStore URL: https://radix.wiki/ecosystem/soulstore Updated: 2026-02-06 Soulstore is a decentralized application (dApp) for virtual accounts (soulstores) that are eternally linked to and controlled by specific non-fungible tokens (NFTs). Soulstores serve as decentralized, soul-bound inventory spaces for NFTs, allowing them to hold and manage various digital assets, including tokens and other NFTs. Features - Permanent Association: Each soulstore is uniquely and permanently associated with a specific NFT, forming an inseparable bond between the two. 
- Asset Storage: Soulstores can securely store different types of digital assets, such as fungible tokens (e.g., cryptocurrencies) and non-fungible tokens (e.g., other NFTs). - Asset Management: Owners of an NFT can deposit, hold, and withdraw assets to and from its associated soulstore, providing a means for managing and transferring digital assets. - Ownership Control: Only the current owner of an NFT has exclusive access and control over the assets stored within its soulstore. - Transferability: When an NFT is traded or transferred to a new owner, the contents of its soulstore are automatically transferred as well, ensuring that the assets remain bound to the NFT. Use Cases Soulstores introduce a new dimension to the utility and value proposition of NFTs. By allowing NFTs to hold and manage digital assets, soulstores enable various use cases, including: - Tokenized Asset Management: Soulstores provide a way to tokenize and manage physical or digital assets by representing them as NFTs, with their associated soulstores holding related tokens or assets. - In-Game Asset Storage: In the realm of blockchain-based gaming, soulstores can serve as secure inventories for in-game assets, ensuring that players' hard-earned items are permanently bound to their NFT characters or collectibles. - Decentralized Finance (DeFi) Applications: Soulstores can facilitate the creation of new DeFi products and services by allowing NFTs to hold and manage various cryptocurrencies and tokens. - Digital Art and Collectibles: Artists and creators can leverage soulstores to enhance the value and utility of their digital art or collectible NFTs by including additional assets or tokens within the associated soulstores. How Soulstores Work Soulstores are designed to provide a seamless and secure way to manage digital assets associated with non-fungible tokens (NFTs). 
Here's how the process of creating, depositing, withdrawing, and transferring soulstores works: Creating a Soulstore - Identify the NFT for which you want to create a soulstore. - Visit the soulstore creation platform (e.g., soulstore.app (http://soulstore.app) ) and search for the specific NFT. - If a soulstore hasn't been created for that NFT yet, you can initiate the creation process. - Pay the required network transaction fees (e.g., on the Radix network) to create the soulstore. - Once created, the soulstore is permanently linked to the NFT, and its address is generated. Depositing Assets into a Soulstore - Obtain the soulstore address associated with the NFT you own. - Using a compatible cryptocurrency wallet (e.g., Radix wallet), send the desired digital assets (tokens or NFTs) to the soulstore address, just like sending to any other wallet address. - The assets will be securely stored within the soulstore, bound to the associated NFT. Withdrawing Assets from a Soulstore - Access the soulstore platform and locate the soulstore associated with your NFT. - Authenticate your ownership of the NFT, typically using your private key or wallet credentials. - Select the specific assets you want to withdraw from the soulstore. - Initiate the withdrawal process, specifying the destination address where you want to receive the assets. - Pay the required network transaction fees for the withdrawal. - The assets will be transferred from the soulstore to the specified destination address. Transferring Ownership of a Soulstore When an NFT with an associated soulstore is traded or transferred to a new owner, the ownership of the soulstore and all its contents are automatically transferred as well. This process is handled seamlessly by the underlying blockchain network and smart contracts, ensuring that the assets remain bound to the NFT throughout its lifecycle. It's important to note that once a soulstore is created for an NFT, it cannot be replaced or duplicated. 
The soulstore remains eternally linked to that specific NFT, regardless of ownership changes or other transitions. The soulstore ecosystem leverages the decentralized nature of blockchain technology, ensuring transparency, security, and immutability in the management and transfer of digital assets associated with NFTs. Technical Details Soulstores are built on top of the Radix blockchain network and leverage several key technical concepts and components. Understanding these details is crucial for developers and users alike to fully grasp the inner workings of soulstores. Non-Fungible Global ID (NFGID) Every NFT minted on the Radix network is assigned a unique Non-Fungible Global ID (NFGID). This identifier is a combination of the NFT's resource address and its specific ID within that resource. The NFGID serves as a globally unique and immutable identifier for each NFT, ensuring that soulstores can be accurately and permanently linked to their respective NFTs. Relationship between NFTs and Soulstores There is a one-to-one relationship between an NFT and its soulstore. Each NFT can have only one associated soulstore, and once a soulstore is created for an NFT, it cannot be replaced or duplicated. This permanent and exclusive binding ensures the integrity and continuity of the soulstore's contents throughout the lifecycle of the NFT. Cost and Fees Associated with Soulstores Creating a soulstore on the Radix network involves paying a one-time network transaction fee. As of writing, the cost to create a soulstore is approximately 0.45 $XRD (Radix's native cryptocurrency). This fee covers the computational resources required to execute the smart contract and establish the soulstore on the blockchain. Once a soulstore is created, there are no recurring fees for holding or managing assets within it. However, depositing or withdrawing assets from a soulstore will incur standard network transaction fees, just like any other blockchain transaction. 
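The Non-Fungible Global ID described above is essentially a pair of resource address and local ID, and can be modelled as such. The `#…#` delimiters below follow the convention Radix uses for integer local IDs, but treat the exact string rendering (and the example address) as assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class NonFungibleGlobalId:
    """Globally unique NFT identifier: resource address plus local ID."""
    resource_address: str
    local_id: int

    def __str__(self) -> str:
        # Integer local IDs are conventionally written between '#' delimiters.
        return f"{self.resource_address}:#{self.local_id}#"

# Hypothetical resource address, for illustration only.
nfgid = NonFungibleGlobalId("resource_rdx1qexample", 42)
print(nfgid)  # resource_rdx1qexample:#42#

# Two NFTs under the same resource differ only in their local IDs:
assert nfgid != NonFungibleGlobalId("resource_rdx1qexample", 43)
```

Because this identifier is immutable and globally unique, a soulstore keyed by it stays bound to the same NFT no matter how often the NFT changes hands.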
It's important to note that the transaction fees mentioned above are subject to change based on network conditions and potential updates to the Radix blockchain. Integration with Other Platforms While soulstores are primarily built on the Radix network, they can potentially integrate with other blockchain platforms and technologies through the use of bridges or interoperability protocols. One such integration in development is the XRD Domains project, which aims to enable the use of human-readable domain names (e.g., cakerobo.xrd) as aliases for NFT soulstore addresses. This integration would allow users to send assets directly to an NFT's soulstore using a friendly domain name, rather than the lengthy and complex soulstore address. ## $DAN URL: https://radix.wiki/ecosystem/dan Updated: 2026-02-06 $DAN (https://www.danxrd.xyz/) is a cryptocurrency token created on the Radix blockchain (https://www.radixdlt.com/) . It was initially launched as a memecoin (https://www.danxrd.xyz/) to pay homage to Dan Hughes, the creator of Radix. Despite starting with zero utility (https://www.danxrd.xyz/) , $DAN quickly gained attention within the Radix ecosystem due to its record-breaking token launch. Overview The project has since evolved from its memecoin origins to become what it claims is the first true passive income project on the Radix network (https://www.danxrd.xyz/) . $DAN's primary feature is a staking mechanism that allows token holders to earn $XRD (the native cryptocurrency of the Radix blockchain) as rewards. $DAN's stated mission (https://www.danxrd.xyz/) is twofold: to reward its token holders and to bring attention to the Radix blockchain. The project emphasizes transparency, community engagement, and a "stake and earn" model as its core principles. 
As of the information provided, $DAN has reported distributing 1 billion tokens (https://www.danxrd.xyz/) through airdrops and payouts to its community members, although the exact timeframe for these distributions is not specified in the source material. History Origin and Creation $DAN was created as a memecoin (https://www.danxrd.xyz/) on the Radix blockchain, with the initial concept centered around paying homage to Dan Hughes, the creator of Radix. The project's founders made no promises (https://www.danxrd.xyz/) beyond a fair launch, creating "bad-ass DAN memes," and implementing what they termed "kick-ass tokenomics." This approach aligns with the typical characteristics of memecoins, which often rely on community engagement and humor rather than technical utility at launch. Record-breaking Token Launch The launch of $DAN marked a significant milestone in the Radix ecosystem. According to the project's website, $DAN broke every token launch record on RADIX (https://www.danxrd.xyz/) . Specifically, the project conducted a 24-hour Initial Coin Offering (ICO) (https://www.danxrd.xyz/) that raised over 1.3 million $XRD. This successful fundraising event also resulted in $DAN achieving the highest liquidity to market cap ratio on the Radix blockchain (https://www.danxrd.xyz/) at the time. Following the successful launch, the $DAN team, with the community's consent (https://www.danxrd.xyz/) , made a strategic decision to withdraw half of the project's liquidity. This capital was repurposed as an investment fund, intended to generate passive income for token holders. In a move towards transparency, the project claims that all investments, transactions, and cross-chain wallet addresses related to this fund are made available to token holders through their Discord server. 
As the project evolved, it transitioned from a pure memecoin to what it describes as the "FIRST true passive income project on $XRD" (https://www.danxrd.xyz/) ($XRD being the ticker for Radix's native token). This shift in focus towards providing tangible benefits to token holders marked a significant development in $DAN's history, setting it apart from its initial memecoin status. Features and Utility Staking Mechanism The primary feature of $DAN is its staking mechanism (https://www.danxrd.xyz/) , which allows token holders to earn rewards in $XRD, the native cryptocurrency of the Radix blockchain. The project emphasizes the simplicity of this process with their tagline "Stake, Earn, and Degen with $DAN" (https://www.danxrd.xyz/) . According to the project's website, users can stake their $DAN tokens through a dedicated staking portal, though the exact launch date of this platform is not specified in the provided information. $XRD Rewards System $DAN implements a rewards system (https://www.danxrd.xyz/) where staked tokens generate $XRD returns for holders. The project states that these rewards are distributed on a quarterly basis (https://www.danxrd.xyz/) . To be eligible for a payout, users must stake their tokens for a minimum period of 30 days (https://www.danxrd.xyz/) prior to a scheduled distribution. However, the overall minimum staking period is set at 3 months (https://www.danxrd.xyz/) , suggesting a mechanism to encourage longer-term holding. Investment Strategy A unique aspect of $DAN's utility is its investment strategy. The project claims to invest in passive income opportunities across multiple blockchain networks (https://www.danxrd.xyz/) . According to their stated model, 80% of the investment profits are distributed to staked token holders (https://www.danxrd.xyz/) , while the remaining 20% is reinvested into the fund to potentially generate more returns for users. 
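The staking rules stated above (a 30-day stake requirement before a quarterly distribution, a 3-month overall minimum, and an 80/20 split of investment profits) can be sketched as follows. The function names, date handling, and figures are hypothetical illustrations of the stated rules, not $DAN's actual implementation.

```python
# Illustrative model of the stated $DAN staking rules; hypothetical code.
from datetime import date, timedelta

def eligible_for_payout(staked_on: date, payout_on: date) -> bool:
    """Stake must be in place at least 30 days before a scheduled payout."""
    return payout_on - staked_on >= timedelta(days=30)

def can_unstake(staked_on: date, today: date) -> bool:
    """Overall minimum staking period of 3 months (modeled as 90 days)."""
    return today - staked_on >= timedelta(days=90)

def split_profits(profit_xrd: float) -> tuple:
    """80% of investment profits to staked holders, 20% reinvested."""
    return profit_xrd * 0.80, profit_xrd * 0.20

print(eligible_for_payout(date(2026, 1, 1), date(2026, 3, 31)))  # True
print(can_unstake(date(2026, 1, 1), date(2026, 2, 1)))           # False
print(split_profits(1000.0))                                     # (800.0, 200.0)
```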
To maintain transparency, the project states that all transactions related to these investments are recorded and made available to token holders through their Discord server (https://www.danxrd.xyz/) . This approach aims to provide a level of accountability and allow community members to track the performance of the investment fund. Additionally, the project mentions plans to integrate an automated trading bot called Kumodroid (https://www.danxrd.xyz/) , which would be offered through the $DAN portal. This feature is intended to provide users with advanced trading capabilities without requiring extensive trading knowledge, though details on its implementation and availability are not specified in the provided information. Tokenomics Token Distribution The $DAN project has implemented a token distribution strategy that includes airdrops and payouts to community members. According to the project's website, the total amount of $DAN distributed through airdrops and payouts to date is 1 billion tokens (https://www.danxrd.xyz/) . However, the provided information does not specify the total supply of $DAN tokens or provide a detailed breakdown of the token allocation. The project emphasizes a community-centric approach (https://www.danxrd.xyz/) in its token distribution, with a focus on rewarding holders through various mechanisms. This includes the staking rewards system, where holders can earn $XRD by staking their $DAN tokens. Liquidity Management $DAN's approach to liquidity management involves several key strategies: - Initial Liquidity Provision: During its launch, $DAN claims to have achieved the highest liquidity to market cap ratio on the Radix blockchain (https://www.danxrd.xyz/) . This suggests a significant portion of the funds raised during the initial coin offering (ICO) was allocated to provide liquidity for trading. 
- Strategic Liquidity Reallocation: After the launch, the project team, with community consent, withdrew half of the provided liquidity (https://www.danxrd.xyz/) . This capital was repurposed to create an investment fund aimed at generating passive income for token holders. - Liquidity Locking: In a move to build trust and ensure long-term stability, $DAN claims to be the first project to lock its liquidity using RADLOCK (https://www.danxrd.xyz/) when the feature became available. The liquidity pool (LP) tokens are reportedly locked until April 20, 6900 - an extremely distant future date likely chosen for humorous effect. - Transparency Measures: The project states that all investments, transactions, and cross-chain wallet addresses related to the investment fund are made available to token holders (https://www.danxrd.xyz/) through their Discord server, promoting transparency in liquidity management. - Reinvestment Strategy: Of the profits generated from investments, 80% is reportedly distributed to staked token holders, while 20% is reinvested into the fund (https://www.danxrd.xyz/) . This approach aims to balance immediate rewards for holders with sustainable growth of the liquidity pool. Team The $DAN project is led by a team of individuals described as "dedicated crypto enthusiasts and seasoned investors" (https://www.danxrd.xyz/) . According to the project's website, the team works to ensure that their decisions benefit both the $DAN community and the broader Radix community. The key team members mentioned are: Key Team Members - The machinist (https://www.danxrd.xyz/) : Described as the Founder and an award-winning web3 podcast producer. No additional details about their background or specific role in the project are provided in the source material. - Yatz (https://www.danxrd.xyz/) : Listed as a $DAN Founder and "Degen Extraordinaire". 
The term "Degen" likely refers to "DeFi Degen", a colloquial term in the cryptocurrency space for someone deeply involved in decentralized finance projects, often taking high risks. - Mcgillz (https://www.danxrd.xyz/) : Serves as the Community Manager. No further details about their background or specific responsibilities are provided in the source material. - MedRare (https://www.danxrd.xyz/) : Described as a Moderator and Founder of Elite Ballerz NFT. This suggests involvement in other blockchain projects, particularly in the Non-Fungible Token (NFT) space, though no additional information is provided about this project or MedRare's specific role in $DAN. Roles and Responsibilities While specific roles and responsibilities for each team member are not detailed in the provided information, the team's overall mission is described as working hard to ensure that every decision benefits not only the $DAN community but the Radix community as a whole (https://www.danxrd.xyz/) . This suggests a collaborative approach to project management and a focus on the broader ecosystem in which $DAN operates. The presence of a dedicated Community Manager (Mcgillz) and a Moderator (MedRare) indicates a focus on community engagement and management, which aligns with the project's emphasis on community-centric approaches. It's important to note that while these team members are listed on the project's website, the provided information does not include detailed backgrounds, professional histories, or specific contributions of each member to the project. As with any cryptocurrency project, potential investors or community members would be advised to seek additional verification and up-to-date information about the team's credentials and ongoing involvement in the project. 
## Cerberus (Consensus Protocol) URL: https://radix.wiki/contents/tech/core-protocols/cerberus-consensus-protocol Updated: 2026-02-05 History The Cerberus protocol was first described in a March 2020 paper (https://assets.website-files.com/6053f7fca5bf627283b582c2/608811e3f5d21f235392fee1_Cerberus-Whitepaper-v1.01.pdf) by Florian Cäsar, Dan Hughes (/community/dan-hughes) , Josh Primero and Stephen Thornton, and has been designed specifically for Radix’s multi-shard architecture (/contents/tech/core-concepts/sharding) . Cerberus represents the sixth iteration (/contents/history/history-of-radix) of the core technology underlying Radix. It addresses the "Weak Atom" problem that existed in the previous version of Radix's protocol, Tempo (/contents/tech/research/tempo-consensus-mechanism) . An evaluation of Cerberus was published in June 2023 in the Journal of Systems Research (https://escholarship.org/uc/jsys) under the title ‘ Cerberus: Minimalistic Multi-shard Byzantine-resilient Transaction Processing. (https://escholarship.org/uc/item/6h427354) ’ The authors’ analysis highlighted the ability of Cerberus to minimize coordination costs for multi-shard consensus. RDX Works (/ecosystem/rdx-works) - core developers of the Radix network - have signaled that a sharded version of Cerberus will be implemented in the Xi’an (/contents/tech/releases/radix-mainnet-xian) mainnet upgrade. Overview Cerberus is a consensus protocol optimized for Radix's multi-shard architecture. It aims to enable scalability, security, and decentralization. - Scalability is attained by sharding which allows parallel transaction processing across multiple shards. The cross-shard communication mechanism coordinates shards efficiently, providing linear scalability as more shards are added. - Security against Byzantine failures up to 1/3 of nodes per shard is ensured through BFT consensus within each shard. This provides resilience even under adverse conditions. 
- Decentralization is supported through permissionless participation of validators. The system remains secure as long as no single entity controls over 1/3 of validators in any one shard. Cerberus partitions transaction processing and state management into groups called shards (/contents/tech/core-concepts/sharding) . Each shard runs a local Byzantine Fault Tolerant (BFT) consensus algorithm to order transactions and maintain local state. This sharding technique is necessary to overcome the scalability limitations of traditional fully replicated networks in which every node must process every transaction. However, multi-shard transactions that span across shards require atomic (/contents/tech/core-concepts/atomic-composability) commitment protocols to maintain consistency. A key innovation of Cerberus is its cross-shard communication mechanism that coordinates multi-shard transactions with low overhead. For this, Cerberus utilizes UTXO (/contents/tech/core-concepts/unspent-transaction-output-utxo-model) -based sharding. In the UTXO model, data must be destroyed when it is modified and recreated in a new state. This prevents double spending across shards, avoiding the need for global transaction ordering. By leveraging UTXO transaction semantics and transaction atomicity, Cerberus minimizes the coordination required for multi-shard commitment, making cross-shard ordering unnecessary. A key capability of Cerberus is the ability to process UTXO transactions concurrently within the same shard without explicit locking or blocking. This is accomplished through an efficient inter-shard communication method between disjoint validator sets. This allows Cerberus to reduce communication and computation costs compared to existing protocols such as AHL, ByShard, Caper, Chainspace, RingBFT, and SharPer. Parallel processing across shards enables linear scalability as shards increase. 
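The UTXO rule described above, that data must be destroyed when modified and recreated in a new state, can be sketched in a few lines. This is a minimal illustration of the concept; the class and method names are hypothetical, not Radix's actual substate API.

```python
# Minimal sketch of UTXO-style substates: modifying state means destroying
# one substate and creating another, so an object can be consumed at most
# once. This is what rules out double spends without global ordering.

class ShardState:
    def __init__(self):
        self.live = {}          # object_id -> value (live substates)
        self.destroyed = set()  # ids that can never be spent again

    def create(self, object_id, value):
        if object_id in self.live or object_id in self.destroyed:
            raise ValueError(f"{object_id} already exists or existed")
        self.live[object_id] = value

    def spend(self, object_id):
        """Destroy a live substate; spending the same id twice fails."""
        if object_id not in self.live:
            raise ValueError(f"{object_id} is not live (double spend?)")
        self.destroyed.add(object_id)
        return self.live.pop(object_id)

shard = ShardState()
shard.create("utxo-1", 100)
value = shard.spend("utxo-1")   # destroys utxo-1
shard.create("utxo-2", value)   # the state is recreated under a new id
# A second shard.spend("utxo-1") would now raise: the substate is gone.
```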
At the execution layer, applications can be split by the Radix Engine (/contents/tech/core-protocols/radix-engine) into independent components compatible with Cerberus sharding. Mechanism Cerberus is an active multi-decree (3-phase commit) consensus mechanism, requiring a supermajority for state commitment and capable of processing transactions in parallel. By way of comparison, Bitcoin is a single decree mechanism (https://youtu.be/1rNeL-X40lc?t=1532) that sacrifices deterministic finality for probabilistic finality with a simpler communication protocol. Cerberus consists of three variants: - Core: The base variant, ideal for general use. - Optimistic: Optimized for scenarios with high trust. - Resilient or Pessimistic: Designed for use in less trustworthy environments. 1. Core Cerberus Core Cerberus (CCerberus) is designed to take maximum advantage of the UTXO transaction properties to minimize coordination costs. CCerberus operates under the assumption that all clients are well-behaved and will never approve conflicting transactions. The key insight in CCerberus is that the order of transaction execution does not matter for correctness as long as the atomicity of each transaction is ensured. CCerberus enforces transaction atomicity using a three-step approach: - Local inputs: Each shard checks if it has the necessary local inputs for a transaction. - When a client sends a transaction request to a shard, the shard first checks if it has all the necessary input objects specified in the transaction. - The shard uses a consensus protocol to create an ordered log of transactions. Consensus is necessary to get consistent results on object availability across replicas. - If the shard has all the specified input objects available, it adds the transaction to its local ordered log with a "pledge" to use those inputs if needed. - Cross-shard exchange: The shards exchange availability of local inputs via a single communication round. 
- The shard that received the client's transaction request broadcasts the transaction with its local input object status to all other shards involved in the transaction. - Cross-shard broadcast uses a "cluster sending" primitive that prevents equivocation and ensures reliable delivery. - The shard waits to receive input object status from all other relevant shards before proceeding. - Decide outcome: If all shards reported input object availability, the transaction is committed by constructing the output objects; else it is aborted. - Shards defer the actual commit/abort until the consensus sequence order allows it. - A single consensus step within each shard is sufficient to totally order the commit or abort of the transaction. - Clients are notified of the commit or abort outcome by enough replicas to withstand Byzantine faults. No coordination is required between shards to agree on transaction ordering: each shard processes transactions independently as long as transaction atomicity is maintained, coordinating only to atomically commit transactions. The key to CCerberus's minimal coordination is its reliance on the transaction atomicity provided by the UTXO model, which guarantees safety for well-behaved clients even without inter-shard agreement on ordering. The cross-shard exchange uses the cluster sending primitive to prevent equivocation, and the decision to commit or abort a transaction requires just a single consensus step in each involved shard. This minimalist design allows CCerberus to scale linearly with low latency as the number of shards increases.
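The three-step commit rule above reduces to a simple predicate: each involved shard pledges its local inputs if they are available, pledges are exchanged, and the transaction commits only if every shard pledged. The sketch below illustrates that decision logic only; names and data structures are hypothetical, not the protocol implementation.

```python
# Sketch of CCerberus's commit decision: commit iff every involved shard
# reports that all of its required input objects are locally available.
# Illustrative only; omits consensus, cluster sending, and recovery.

def shard_pledge(live_inputs: set, required_inputs: set) -> bool:
    """Step 1: a shard pledges iff all its required inputs are live."""
    return required_inputs <= live_inputs

def decide(pledges: list) -> str:
    """Step 3: commit iff every involved shard indicated availability."""
    return "commit" if all(pledges) else "abort"

# Two shards, each holding part of the transaction's inputs:
shard_a_live = {"obj-1", "obj-2"}
shard_b_live = {"obj-3"}

pledges = [
    shard_pledge(shard_a_live, {"obj-1"}),  # shard A holds obj-1
    shard_pledge(shard_b_live, {"obj-3"}),  # shard B holds obj-3
]
print(decide(pledges))  # commit

# If any shard is missing an input, the whole transaction aborts:
print(decide([shard_pledge(shard_b_live, {"obj-4"})]))  # abort
```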
However, CCerberus provides no liveness or safety guarantees for transactions issued by malicious clients. The simplicity of CCerberus highlights the benefits of specializing distributed ledger designs for UTXO workloads. During testing, Core Cerberus was able to achieve extremely high throughput in the order of millions of transactions per second - 5-12x higher throughput than prior generalized techniques. This demonstrates the linear scalability potential of Cerberus' approach to cross-shard coordination. 2. Optimistic Cerberus Optimistic Cerberus (OCerberus) aims to lift the assumption of only well-behaved clients made by CCerberus. OCerberus targets environments where faulty conditions are rare and optimizes for the normal case. The key change in OCerberus is integrating the detection of conflicts due to concurrent transactions into a single multi-shard consensus step. This is done by having a shard only participate in the consensus if it receives evidence that other relevant shards are also participating. In the optimistic case with no conflicts, OCerberus processes transactions using minimal coordination similar to CCerberus. Safety is still guaranteed even under malicious actions. However, progress can be disrupted by coordinated attacks that prevent consensus. OCerberus incorporates intricate recovery mechanisms to handle such malicious scenarios. Both local view change per shard and a novel global view change process across shards are defined. Overall, OCerberus reduces the latency compared to CCerberus for common transactions by combining cross-shard communication within consensus. But it has higher costs for recovery from faulty conditions. 3. Pessimistic Cerberus Pessimistic / Resilient Cerberus (PCerberus) takes a pessimistic approach and anticipates malicious behavior by adding coordination to the normal case operations. 
The goal of PCerberus is to gracefully handle faulty conditions, including transactions from malicious clients, without complicating the recovery process. PCerberus enhances CCerberus with two key changes: - Objects are destructed after the local consensus step and before cross-shard exchange. This prevents equivocation and firmly locks objects to a transaction. - A second local consensus step is added after cross-shard exchange to totally order the commit operations in each shard. By destructing objects early, safety is ensured even under concurrent transactions created by faulty clients. The second consensus step simplifies recovery by relegating conflict resolution to the local shard level. No intricate cross-shard coordination is needed during view change. Overall, PCerberus adds more coordination in the common case compared to CCerberus and OCerberus but enables simpler reasoning about safety and liveness even under attack. The extra coordination phases trade off some efficiency for resilience. PCerberus highlights the spectrum of design choices possible when co-optimizing sharding techniques with application semantics. Performance Cerberus and its peer consensus protocols were subjected to a variety of rigorous tests, including scenarios with varying numbers of replicas per shard, objects per transaction, and number of shards. Tests also included situations with malicious nodes, gauging how the various protocols would perform under adverse conditions. In these tests, all three flavors of Cerberus demonstrated superior efficiency, throughput, and latency compared to other state-of-the-art multi-shard consensus protocols. Notably, Cerberus exhibited linear scalability, achieving over ten million transactions per second (/contents/tech/core-concepts/transactions-per-second-tps) on 1024 shards. Figure: Throughput of the three Cerberus protocols as a function of the number of shards.
Other implementations of Cerberus In 2022, the Tari (https://www.tari.com/) project announced (https://www.tari.com/updates/2022-09-22-update-89.html) a pivot from its original layer 2 data availability network (DAN) to the Cerberus consensus algorithm. The new DAN based on Cerberus will integrate with Tari's existing Mimblewimble layer 1 to provide a network with advantages over pure proof-of-stake systems. Resources Cerberus Infographic Series - Chapter I | The Radix Blog | Radix DLT (https://www.radixdlt.com/blog/cerberus-infographic-series-chapter-i) At Radix, our strategic vision has always been to build a single decentralized network that can scale DeFi seamlessly to every single person on the planet (even though we didn’t initially know that DeFi was going to be called DeFi). (https://www.radixdlt.com/blog/cerberus-infographic-series-chapter-i) DEVELOPMENT Whitepapers 2020 Cerberus Consensus Protocol (https://assets.website-files.com/6053f7fca5bf627283b582c2/608811e3f5d21f235392fee1_Cerberus-Whitepaper-v1.01.pdf) 2023 Cerberus Consensus Protocol (https://escholarship.org/uc/item/6h427354) Peer Review JSys (https://escholarship.org/uc/item/6h427354) LEDGER Sybil Protection Delegated Proof of Stake (/contents/tech/core-concepts/delegated-proof-of-stake-dpos) State Model Sharded (/contents/tech/core-concepts/sharding) Fault Tolerance Byzantine Fault Tolerant Execution Environment Radix Engine (/contents/tech/core-protocols/radix-engine) v2 Validator Node Cap n/a ## Radix Mainnet (Babylon) URL: https://radix.wiki/contents/tech/releases/radix-mainnet-babylon Updated: 2026-02-03 Babylon [ /ˈbæbəˌlɑn/ ] is the current version release of the Radix network that debuted smart contracts and the Radix Engine (/contents/tech/core-protocols/radix-engine) execution environment. 
The migration (https://www.youtube.com/live/DIJbfZ_xPKE) from the Alexandria (/contents/tech/releases/radix-developer-environment-alexandria) release occurred at epoch 32717 on the 28th September 2023. Features Native Assets Assets like tokens being ‘native’ means that they are in-built, first-order functions of the language (https://learn.radixdlt.com/article/what-are-native-assets) , rather than being second-order prescriptions of a smart contract. Tokens, NFTs, badges, and pool units are all ‘resources’ in this paradigm, as understood by the Radix Engine. Native assets offer benefits for both ease and security of dApp development and improved user experience in the Radix Wallet. Each asset's unique behaviour and data are expressed in the resource configuration rather than complex smart contract code, allowing the Radix Wallet to display all necessary information in a clear, human-readable way. In the future, physical scanning of Native Assets through NFC technology will also be possible. All Native Assets are held in user Smart Accounts, which include the ability to prove control through a signed challenge. Smart Accounts A smart account is a component on the Radix ledger that can hold assets and allow logical operations on those assets (https://learn.radixdlt.com/article/what-are-smart-accounts) . This includes multi-factor authentication and gated deposits. Smart accounts can be created ‘virtually’ without a transaction and are easily configured via the Radix Wallet. Account Abstraction The Babylon release will introduce account abstraction, simplifying user interactions with the Radix platform and providing an improved experience compared to Ethereum's EIP-4337. This feature will be more easily updated and adopted across the ecosystem. Radix Wallet The Radix Wallet is an iOS / Android application built specifically for Radix’s Babylon release (https://learn.radixdlt.com/article/what-is-the-radix-wallet-for-babylon) and designed for native DeFi and web3 use. 
The wallet will replace the current Radix desktop wallet and is expected to offer improved security and user experience, allowing users to import their Olympia JSON files and passwords without entering seed phrases online. This feature is designed to protect users from keylogger attacks. Personas A persona is a set of user data that resides in the Radix Wallet (https://learn.radixdlt.com/article/what-are-personas-and-identities) . Personas may be tailored to contain web3 information such as a wallet address as well as personal information such as an email address, phone number and address. The information is stored off-ledger in the wallet rather than on-chain, reducing the burden of data liability and risk of theft. Personas can be selectively hidden and revealed when interacting with decentralized applications (dApps), which also enables password-less login on web3 websites (https://learn.radixdlt.com/article/what-are-personas-and-identities) , verified by a cryptographic ‘challenge’. Transaction Manifests Transaction manifests are a feature of the Radix Wallet that provides the user with a human-readable summary of every operation within an atomic transaction (https://learn.radixdlt.com/article/what-are-transaction-manifests) before it is signed. Additionally, transaction manifests can include authorization using badges, payment of transaction fees, and validation of resource amounts to ensure guaranteed outcomes for users. Because manifests are human-readable, developers and users of the software can easily understand what they are signing. The final transaction is created by converting the manifest into a binary representation and cryptographically signing it. This allows transactions to be sent efficiently over the network and processed by the Radix Engine. Delegated Fees Once blueprints and components are live, the Radix Engine (/contents/tech/core-protocols/radix-engine) will enable transaction fees to be borne by applications (https://youtu.be/qEWuLQmp8P0?t=475) rather than users.
Radix Connect Radix Connect is an encrypted method to securely link a Radix Wallet on a mobile phone to a desktop instance (https://learn.radixdlt.com/article/what-is-radix-connect) by scanning a QR code on a browser extension called Radix Connector. Radix Connect is a technology that enables users to access DeFi and Web3.0 dApp (https://learn.radixdlt.com/article/what-is-radix-connect) sites on their desktops while having a mobile-first experience on their Radix wallet. As a user, you can link your Radix Wallet on your mobile phone to your desktop browser in a one-time setup process. The Radix Connector browser extension for desktop displays a QR code that the Radix mobile wallet scans to enable automatic connection when a dApp website attempts to connect to the wallet. Connections are fully encrypted end-to-end and peer-to-peer, ensuring message security and eliminating the need for central servers. Radix Connect uses WebRTC and special communication protocols for efficient transactions and wallet data transfers, even if the devices are on different networks. Additionally, the Radix wallet can connect to mobile websites via "deep linking" by passing messages directly between the wallet app, the mobile browser app, and the dApp running on top of it. Ledger-Enforced Royalties The Babylon release aims to improve the functionality of the Radix platform for developers. One of its key features is the ability for developers to add a small fee to every transaction that uses their blueprints or components and have them enforced automatically by the Radix Engine (/contents/tech/core-protocols/radix-engine) . Radix Web UI SDK The Radix Web UI SDK is a software development kit that allows developers to create user interfaces for decentralized applications on the Radix platform. It provides a set of tools and libraries for building web-based frontends that interact with the Radix Engine and the Radix Wallet. 
This allows developers to create custom interfaces for their dApps that are tailored to their specific needs and design requirements. Gateway SDK The Radix Gateway SDK is a software development kit that enables developers to create applications that interact with external systems and blockchains. This allows developers to connect their decentralized applications to other networks and services, such as payment gateways, data feeds, and identity providers. The Gateway SDK provides a set of tools and libraries for building custom gateways that interface with the Radix Engine and the Radix Wallet, allowing developers to create dApps that can interact with the broader ecosystem of blockchain and web3 technologies. Radix Engine Toolkit The Radix Engine Toolkit is a set of tools and libraries that developers can use to build custom extensions for the Radix Engine. These extensions can include custom transaction types, smart contracts, and other features that are not available in the core engine. The toolkit provides developers with a framework for building decentralized applications on the Radix platform. Liquid Staking Units Liquid Staking Units (LSUs) are a key feature of the Babylon update to the Radix distributed ledger. LSUs function as special tokens that users receive in exchange for staking XRD on the Radix network. The introduction of LSUs will change the way decentralized applications (dApps) interface with staking, simplifying the process for users and developers alike. When a user issues a stake request, they receive an LSU, which represents a claim on a percentage of the total amount of staked XRD held by the validator they staked with. Each LSU has a specific cost basis, depending on the price of XRD when the dApp executed the stake. To unstake all or part of an LSU, the dApp calls the unstake() function, passing the fraction of the LSU the user wants to unstake. Users receive their original XRD amount back, along with any staking emissions earned.
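The pro-rata claim described above can be expressed as a one-line calculation: an LSU redeems for its share of the validator's total staked XRD, so its redemption value grows as emissions accrue to the pool. The function name and figures below are illustrative assumptions, not the actual Radix implementation.

```python
# Sketch of LSU redemption value as a pro-rata claim on the pool.

def lsu_redemption_value(lsu_amount, total_lsu, total_staked_xrd):
    """XRD claimable for `lsu_amount` LSUs of this validator's pool."""
    return lsu_amount * total_staked_xrd / total_lsu

# A pool of 1000 staked XRD backed by 1000 LSUs: 1 LSU redeems for 1 XRD.
print(lsu_redemption_value(100.0, 1000.0, 1000.0))   # 100.0

# After 50 XRD of emissions accrue to the pool, the same LSUs redeem for more:
print(lsu_redemption_value(100.0, 1000.0, 1050.0))   # 105.0
```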
One potential issue with this model is that users who frequently stake and unstake or dollar-cost average (DCA) may accumulate numerous LSU tokens in their wallet, each with a separate cost basis. To solve this problem, a new component will be introduced to provide a simple interface for staking and unstaking. This new component will allow users to think of the LSU pool token as a single entity, abstracting away the underlying complexity. Additionally, the new component will implement a smart algorithm associated with the unstaking process, which will choose the most tax-efficient way to perform withdrawals. For most countries, this will likely be LIFO (last in, first out) to maximize the tax treatment for long-term capital gains. Token Types Soulbound tokens ‘Soulbound’ tokens cannot be transferred to another address and so are useful for assets that shouldn’t be shared or sold on, such as concert tickets. Recallable Token Types Babylon will enable tokens to be lent and recalled after a set time. Test Networks RCNet v1 Radix Babylon RCnet v1 was a test network developed by RDX Works and released on 31 March 2023 after a series of Betanet versions. RCnet aimed to provide a comprehensive set of tools and features for developers, early integrators, and exchanges. Transaction Review within the Wallet The latest iOS Wallet preview build released with RCnet allows users to review transactions. While the functionality is still under development, users can monitor their accounts' activities, badges, application interactions, and deposits. Future updates plan to include: - User-customizable guarantees on deposits - Special views for specific transaction types - Advanced view for scrutinizing transaction manifests - Metadata-defined icons for applications, components, and resources Core API RCnet's Core API is designed to cater to key integrators like exchanges.
Unlike traditional crypto transaction models that follow a "sender-recipient" pattern, Babylon uses a more complex system. The Core API aims to simplify the Babylon integration path for exchanges and third parties. Personas and Off-ledger Data Sharing Personas in the Wallet preview facilitate data sharing between users and dApps. This feature allows dApps to request personal data fields directly from the Radix Wallet, making the user experience more streamlined. Radix dApp Toolkit Formerly the Radix Connect Button, the Radix dApp Toolkit provides a unified interface for frontend developers. It allows automatic session management, data caching, and notifications about transactions and requests. Metadata Standards RCnet introduces a preliminary set of metadata standards for better integration between on- and off-ledger systems. This enables better information presentation to users and offers a defense against copycat dApps. Radix Engine Toolkit Informed by developer community feedback, the Radix Engine Toolkit is slated to include: - A TypeScript target for frontend developers - A command-line version for Scrypto developers - A new layer of functions for transaction building & signing Roadblocks - The Gateway API has yet to achieve a stable state. - Radix Off-Ledger Authentication (ROLA) tooling remains incomplete. - Fee table adjustments are still under research. RCnet v2 Babylon RCnet v2 was a test network launched on July 6, 2023, as a precursor to the Babylon mainnet upgrade. The release primarily focused on Scrypto, the scripting language for decentralized applications on the network, and the Radix Engine. RCnet v2 was released in two phases, with the first phase featuring "low-level" components such as nodes and the Radix Engine, and the second phase rolling out updated versions of additional tools and services. 
Developer Tools RCnet v2 brought several updates to developer interfaces and tools: - Redesigned authentication developer interface: The new system improves how developers specify authentication for their applications. - Native pools and pool units: A system for creating liquidity pools was introduced, which allows Radix Wallet to recognize "pool units" or liquidity provider tokens. - Better handling for referencing blueprints, components, and resources: The new syntax makes it easier to interact with different components and resources. - Fee table adjustments: The fee structure was revamped to reflect real-world computational costs more accurately. Scrypto Development With the release of RCnet v2, Scrypto became feature-complete for the Babylon mainnet upgrade. The scripting language underwent several enhancements: - Improved authentication methods: The new system simplifies the process of specifying roles and permissions. - Native liquidity pool blueprints: New blueprints were introduced for creating various types of liquidity pools. - Syntax and Error Handling: Syntax has been revised to make development more intuitive, and custom error messages were added to aid development. RCNet v3 Babylon RCnet v3 is a test network released on August 31, 2023, serving as the last major test network version before the Babylon mainnet upgrade. Developed to provide a stable environment for developers to build and test applications, the test network focuses on stability and minor improvements, assuring developers that applications compatible with RCnet v3 will function as expected on the Babylon mainnet. Babylon RCnet v3 has made a number of advancements primarily in areas of system stability and developer tools. This aims to assist developers in building and testing their applications more efficiently and effectively. Developer Tools Alongside the RCnet v3 release, various tools have been updated or introduced to aid developers in building and testing applications. 
These include: - Radix Wallet Preview & Connector Extension - Radix dApp Toolkit - Radix Engine Toolkit - Gateway API - Dashboard Scrypto Development In response to community feedback, an entirely new unit testing framework has been developed, enabling faster and more targeted testing. Additionally, decimal-related types for fixed precision math have been revised to address more real-world scenarios and to allow the compiler to enforce proper handling of potential overflow scenarios. DEVELOPMENT Launch Date Sept 28, 2023 (https://www.radixdlt.com/blog/babylon-mainnet-upgrade-complete) Antecedent Alexandria (Developer Environment) (/contents/tech/releases/radix-developer-environment-alexandria) Postcedent Xi’an (/contents/tech/releases/radix-mainnet-xian) License Radix License, v1 (http://radixfoundation.org/licenses/license-v1) LEDGER State Model Sharded (/contents/tech/core-concepts/sharding) Shard Groups 1 (/contents/tech/core-concepts/shard-groups) Sybil Protection Delegated Proof of Stake (/contents/tech/core-concepts/delegated-proof-of-stake-dpos) Consensus Protocol Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) Execution Environment Radix Engine (/contents/tech/core-protocols/radix-engine) v2 Programming Language Scrypto (/contents/tech/core-protocols/scrypto-programming-language) Networking Babylon node (https://github.com/radixdlt/babylon-node) Validator Node Cap 100 ## Radix Developer Environment (Alexandria) URL: https://radix.wiki/contents/tech/releases/radix-developer-environment-alexandria Updated: 2026-02-03 Alexandria [ /ˌæləgˈzændriə/ ] was a pre-Babylon version of the Radix network and extended the Olympia (/contents/tech/releases/radix-mainnet-olympia) release with a Scrypto (/contents/tech/core-protocols/scrypto-programming-language) developer environment for testing applications on a local network simulator. 
Features Alexandria brought the release of an early form of Scrypto and associated tools on the Scrypto Github repo, all of which are open source. Scrypto is a set of libraries and extensions to the popular Rust programming language that provides the asset-oriented features that define the Scrypto experience of writing smart contracts. To test Scrypto blueprints and components meaningfully, the developer needs some way of deploying that code and interacting with it. Alexandria provides a simulator of such an environment that developers can use to quickly build, test, and iterate Scrypto code, all on their local computer. Scrypto Development To help developers get started with Scrypto, Radix has launched a new Radix Developers Site, which provides links to installation instructions, documentation, examples, and more. Radix has also published a series of blog articles describing the need for Scrypto and how it is different from today's smart contract paradigm, as well as a new DeFi White Paper that collects much of the same information in one place. DEVELOPMENT Launch Date 2021-12-15 (https://www.radixdlt.com/blog/alexandria-scrypto-is-here) Antecedent Olympia (/contents/tech/releases/radix-mainnet-olympia) Postcedent Babylon (/contents/tech/releases/radix-mainnet-babylon) License Radix License, v1 (http://radixfoundation.org/licenses/license-v1) LEDGER State Model Sharded (/contents/tech/core-concepts/sharding) Shard Groups 1 (/contents/tech/core-concepts/shard-groups) Sybil Protection Delegated Proof of Stake (/contents/tech/core-concepts/delegated-proof-of-stake-dpos) Consensus Protocol Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) Execution Environment Radix Engine (/contents/tech/core-protocols/radix-engine) v2 Validator Node Cap 100 ## Stokenet URL: https://radix.wiki/contents/tech/releases/stokenet Updated: 2026-02-03 Stokenet is the persistent public test network (https://learn.radixdlt.com/article/what-is-stokenet) for Radix. 
Launched (https://www.radixdlt.com/blog/stokenet-is-live) in July 2021, Stokenet serves as a sandboxed environment for Radix users and project builders to test applications and the latest versions of the Radix Node software. History and Purpose Stokenet was launched (https://learn.radixdlt.com/article/what-is-stokenet) on July 7, 2021 to replace the Olympia (/contents/tech/releases/radix-mainnet-olympia) Betanet. The name "Stokenet" is a nod to Stoke-on-Trent (https://www.radixdlt.com/blog/stokenet-is-live) , England, the birthplace of Radix, paying homage to the project's roots and origins. The primary purpose (https://learn.radixdlt.com/article/what-is-stokenet) of Stokenet is to provide a parallel ledger version history to the Radix mainnet, allowing developers, node runners, and enthusiasts to interact with and validate the Radix network without the risk of impacting the actual mainnet operations. Stokenet features a faucet system (https://www.radixdlt.com/blog/stokenet-is-live) that distributes free (test) $XRD tokens, enabling users to explore and test the network's capabilities without the need for real-world value transactions. 
Technical Details Connect to Stokenet - Download and install the Radix Wallet (/contents/tech/core-protocols/radix-wallet) from https://wallet.radixdlt.com (https://wallet.radixdlt.com) . - From the Settings ⚙️ menu, click on App Settings. - Switch on Developer Mode. - Then click on Network Gateways. - Click on Add New Gateway. - Add the following gateway URL: https://babylon-stokenet-gateway.radixdlt.com (https://babylon-stokenet-gateway.radixdlt.com) Get Test $XRD - Click on your Stokenet account. - Click on the menu in the top-right corner. - Click on ‘Get XRD Test Tokens’ at the bottom of the page. Features Stokenet offers a range of features (https://www.radixdlt.com/blog/stokenet-is-live) that facilitate comprehensive testing and experimentation for the Radix ecosystem: - Parallel Ledger Version History: One of the core features of Stokenet is its ability to maintain a ledger version history that runs parallel to the Radix mainnet. This ensures that developers and node runners can accurately simulate and test various scenarios without impacting the live mainnet operations. - Faucet for Test $XRD: To enable seamless testing and exploration, Stokenet provides a faucet system that distributes free (test) XRD tokens to users. These test tokens have no real-world value but allow users to interact with the network, deploy smart contracts, and test various functionalities without the need for real-world value transactions. - Community Node Runner Participation: Stokenet is designed to be an open test environment for the Radix community. Node runners interested in testing and practicing the operation of validator nodes can join the official Radix Discord and request stake allocation. However, it's important to note that stake on Stokenet is not guaranteed, and the Radix team may adjust or reset it as needed during testing phases. 
- Different Address Prefix: To prevent accidental transfers between Stokenet and the Radix mainnet, all addresses and IDs on Stokenet have a distinct human-readable prefix. This ensures that the test network addresses are clearly differentiated from mainnet addresses, mitigating the risk of unintentionally sending real tokens to a test network address or vice versa. Resources Developer Console The Radix Developer Console | Radix Developer Utilities (https://stokenet-console.radixdlt.com/deploy-package) Stokenet Dashboard The Radix Dashboard | Explorer, staking & more for Radix (https://stokenet-dashboard.radixdlt.com/) DEVELOPMENT Launch Date July 07, 2021 (https://www.radixdlt.com/blog/stokenet-is-live) Antecedent RCNet (/contents/tech/releases/rcnet) License Radix License, v1 (http://radixfoundation.org/licenses/license-v1) LEDGER State Model Sharded (/contents/tech/core-concepts/sharding) Shard Groups 1 (/contents/tech/core-concepts/shard-groups) Sybil Protection Delegated Proof of Stake (/contents/tech/core-concepts/delegated-proof-of-stake-dpos) Consensus Protocol Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) Execution Environment Radix Engine (/contents/tech/core-protocols/radix-engine) v2 Programming Language Scrypto (/contents/tech/core-protocols/scrypto-programming-language) Networking Babylon node (https://github.com/radixdlt/babylon-node) ## Radix Mainnet (Olympia) URL: https://radix.wiki/contents/tech/releases/radix-mainnet-olympia Updated: 2026-02-03 Olympia [ /oʊˈlɪmpiə/ ] was the initial release of the Radix Public Network and introduced the core node software, desktop wallet, staking, the Radix Explorer, and the native $XRD token. Olympia did not employ the Radix Engine (/contents/tech/core-protocols/radix-engine) as a runtime environment for decentralized applications. 
Etymology All Radix public releases are related to the Seven Wonders of the Ancient World (https://en.wikipedia.org/wiki/Seven_Wonders_of_the_Ancient_World) . Olympia was the location of the Statue of Zeus (https://en.wikipedia.org/wiki/Statue_of_Zeus_at_Olympia) . Release Components The Olympia release provided the core components of the Radix ecosystem, which continued to expand with future releases. Key components included: Radix Node software Implemented the core Radix protocol, including Radix Engine (/contents/tech/core-protocols/radix-engine) v1 and unsharded Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) consensus. The Radix Engine The first version of Radix's replacement for the Ethereum Virtual Machine, designed to eventually run full DeFi applications. RADIX token - $XRD The native $XRD (/contents/tech/core-protocols/xrd-token) token, used for staking, network security, and transaction fees. Simple Token Creation Allowed new tokens to be created, named, minted, burned, and transacted via the JSON-RPC Radix Node API. 
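As a rough illustration of how such a call to the JSON-RPC Radix Node API would be shaped, the snippet below assembles a generic JSON-RPC 2.0 request. The method name `token.create` and its parameter fields are hypothetical placeholders chosen for this sketch, not the actual Olympia method names; only the envelope (`jsonrpc`, `id`, `method`, `params`) is fixed by the JSON-RPC 2.0 specification.

```python
import json

# Generic JSON-RPC 2.0 request envelope for a token-creation call.
# NOTE: "token.create" and its params are HYPOTHETICAL placeholders --
# consult the Olympia node API documentation for the real method names.
request = {
    "jsonrpc": "2.0",          # protocol version, fixed by the JSON-RPC 2.0 spec
    "id": 1,                   # client-chosen id, echoed back in the response
    "method": "token.create",  # hypothetical method name
    "params": {
        "symbol": "demo",
        "name": "Demo Token",
        "supply": "1000000",
    },
}

body = json.dumps(request)
print(body)  # this string would be POSTed to the node's JSON-RPC endpoint
```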
DEVELOPMENT Launch Date 2021-07-28 (https://www.radixdlt.com/post/radix-olympia-mainnet-is-here) Antecedent eMunie (/contents/tech/research/emunie) Postcedent Alexandria (Developer Environment) (/contents/tech/releases/radix-developer-environment-alexandria) Code Repository Github 🔗 (https://github.com/radixdlt/babylon-node/blob/4e6ef24f3dff750ea3515416e27bcaa8f454cbdc/olympia-engine/) License Radix License, v1 (http://radixfoundation.org/licenses/license-v1) LEDGER State Model Sharded (/contents/tech/core-concepts/sharding) Shard Groups 1 (/contents/tech/core-concepts/shard-groups) Sybil Protection Delegated Proof of Stake (/contents/tech/core-concepts/delegated-proof-of-stake-dpos) Consensus Protocol Hotstuff (https://hackernoon.com/hotstuff-the-consensus-protocol-behind-safestake-and-facebooks-librabft) Execution Environment Radix Engine (/contents/tech/core-protocols/radix-engine) v1 Validator Node Cap 100 ## RCnet URL: https://radix.wiki/contents/tech/releases/rcnet Updated: 2026-02-03 RCnet was a test network developed by Radix to provide developers with the tools and standards necessary for creating more robust applications on Radix. It has since been superseded by Stokenet (/contents/tech/releases/stokenet) . Features Transaction Review The latest iOS Wallet preview build includes an initial version of the transaction review functionality, enabling users to monitor their account transactions, view the badges they are presenting, and interact with various applications. Although the feature remains under development, it establishes a foundation for future enhancements, such as: - Customizable guarantees on deposits. - Prominent warnings for potentially high-risk transactions. - Special views for specific transaction types. - An "advanced view" for users desiring a detailed examination of transaction manifests. - Improved handling of resources created within transactions. - Metadata-defined icons for applications, components, and resources. 
- Links to additional screens detailing user interactions with dApps and resources. - Accurate fee estimates and a breakdown of fees between network costs and royalties. - Multiple visual adjustments and refinements. Core API for Integrators The node-provided Core API streamlines the Babylon integration process for exchanges and other third parties, allowing key integrators to work with the Radix Engine and node without the need to operate their own Gateway instance. Personas and Off-Ledger Data Sharing with dApps The updated Wallet preview now incorporates Personas, enabling developers to request personal data directly from the Radix Wallet. The current preview Wallet supports a limited set of personal data fields, with plans to expand the range of fields as development continues. Radix dApp Toolkit The Radix dApp Toolkit extends the capabilities of the Radix Connect Button, offering automatic session management, tracking and notifications for requests and transactions, data caching, and more. Metadata Updates and New Standards RCnet introduces an initial set of metadata standards designed to assist developers in ensuring proper integration of their dApps, tokens, NFTs, and other components with the Radix Wallet and other clients. Radix Engine Toolkit Expansion In response to developer feedback and the needs of early integrators, the Radix Engine Toolkit will be updated with new features, such as a TypeScript target, a command-line version for Scrypto developers, and additional functions for building and signing basic transactions. Limitations and Ongoing Development While RCnet achieved the majority of its objectives, certain areas, including the Gateway API, Radix Off-Ledger Authentication (ROLA), and fee table adjustments, remain under development. These features will be available before the Babylon release, although no specific release date has been provided. 
DEVELOPMENT Launch Date 2021-07-28 (https://www.radixdlt.com/post/radix-olympia-mainnet-is-here) Antecedent eMunie (/contents/tech/research/emunie) Postcedent Stokenet (/contents/tech/releases/stokenet) Code Repository https://github.com/radixdlt/babylon-node/blob/4e6ef24f3dff750ea3515416e27bcaa8f454cbdc/olympia-engine/ (https://github.com/radixdlt/babylon-node/blob/4e6ef24f3dff750ea3515416e27bcaa8f454cbdc/olympia-engine/) License Radix License, v1 (http://radixfoundation.org/licenses/license-v1) LEDGER State Model Sharded (/contents/tech/core-concepts/sharding) Shard Groups 1 (/contents/tech/core-concepts/shard-groups) Sybil Protection Delegated Proof of Stake (/contents/tech/core-concepts/delegated-proof-of-stake-dpos) Consensus Protocol Hotstuff (https://hackernoon.com/hotstuff-the-consensus-protocol-behind-safestake-and-facebooks-librabft) Execution Environment Radix Engine (/contents/tech/core-protocols/radix-engine) v1 Validator Node Cap 100 ## Tempo (Consensus Mechanism) URL: https://radix.wiki/contents/tech/research/tempo-consensus-mechanism Updated: 2026-02-03 Tempo [ /ˈtɛmˌpoʊ/ ] was a consensus protocol developed as the fifth iteration (/contents/history/history-of-radix) of Radix. First proposed in 2017 (/contents/tech/research/tempo-consensus-mechanism) , Tempo pioneered innovations such as ledger pre-sharding (/contents/tech/core-concepts/sharding) to allow for the grouping of related transactions, and ‘lazy’ consensus. 
The ideas contained in the Tempo (https://assets.super.so/2a0d12a5-9454-4417-b17e-d51617137fb6/files/e4f934ff-25c3-4ffe-a6dd-2bbb82714e64.pdf) and Public Node Incentives (https://file.notion.so/f/f/519183ae-9211-4e91-b978-311452f08ddd/ea3a263c-16cc-4eb2-b2fc-88a38dbe8035/2017-12-24_Radix_Public_Node_Incentives_White_Paper.pdf?id=8cef85ac-21db-4e39-827e-543a772ae5b9&table=block&spaceId=519183ae-9211-4e91-b978-311452f08ddd&expirationTimestamp=1709834400000&signature=IP7a-Sh6WBGOqS323f49ZQhiFYf8t2vMyIRD0R5ZxvU&downloadName=2017-12-24+Radix+Public+Node+Incentives+White+Paper.pdf) papers served as a foundation for Radix's subsequent Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) protocol. Background Tempo emerged from efforts to solve inherent scalability limitations in blockchain architectures like Bitcoin and Ethereum. While revolutionary, these early blockchain designs restricted transaction speeds to levels inadequate for worldwide adoption. Radix aimed to create a distributed ledger capable of serving global demand - potentially allowing billions of people to use decentralized applications. This required rethinking core components like network architecture, data structure, and consensus mechanisms. Inspiration came from areas like directed acyclic graphs, which demonstrated high transaction speeds but compromises in decentralization. No existing approach provided the combination of scalability, decentralization, and programmability Radix sought. Tempo’s core innovation was allowing the ledger to be massively sharded to enable parallel transaction processing. Consensus was also optimized for scalability using a ‘lazy’ mechanism where nodes only communicate when necessary, avoiding unnecessary global coordination. Logical clocks and temporal proofs were introduced to establish consensus and order between events. 
Together these new techniques aimed to fulfill the vision of a fast, decentralized, and programmable global ledger capable of serving as the infrastructure for the next generation of digital applications. On June 11th, 2019, engineers from RDX Works replayed 10 years’ worth of Bitcoin transactions on the Radix ledger (https://www.radixdlt.com/post/replaying-bitcoin) in under 30 minutes using the Tempo consensus mechanism. The invention of Tempo led to Dan Hughes (/community/dan-hughes) being accepted into Y Combinator in 2017. Development The core ideas behind Tempo were state sharding, lazy consensus, logical clocks, temporal proofs, and cryptographic commitments. State Sharding Sharding involved partitioning the ledger into 18.4 quintillion (18.4 × 10^18) shards, with each shard deterministically corresponding to an address, consisting of a public key and a checksum. Nodes only needed to store and validate a subset of shards, which allowed parallel transaction processing since each shard could process requests independently. Tempo utilized "massive presharding", fixing the number of shards at the outset rather than dynamically adjusting. This enabled smooth capacity scaling tailored to each node. Lazy Consensus Lazy consensus further optimized performance by having nodes communicate only when necessary rather than constantly. Nodes would accept requests and then check with others only if a conflict arose later. This greatly reduced coordination overhead. Logical Clocks Logical clocks tracked events witnessed by each node. Nodes would attach a logical clock value to each request, creating a temporal proof. These could resolve conflicts by identifying the ordering between disputed events. Cryptographic Commitments Cryptographic commitments were also implemented to hold nodes accountable for their logical clock values, preventing manipulation. Nodes had to commit to their recent requests when contributing to proofs. 
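Two of the mechanisms above, deterministic shard assignment and logical clocks, can be illustrated with a short sketch. The hash construction and clock rules below are illustrative assumptions, not Tempo's actual wire format or commitment scheme; only the general ideas (a fixed 2^64 shard space keyed by public key, and Lamport-style event counting) come from the description above.

```python
import hashlib

SHARD_SPACE = 2**64  # 18.4 quintillion shards, fixed at network launch

def shard_of(public_key: bytes) -> int:
    """Deterministically map an address's public key to one of 2^64 shards.
    The SHA-256-based construction here is illustrative, not Tempo's own."""
    digest = hashlib.sha256(public_key).digest()
    # The first 8 bytes give a uniform 64-bit shard index.
    return int.from_bytes(digest[:8], "big") % SHARD_SPACE

class LogicalClock:
    """Lamport-style logical clock: each node counts the events it
    witnesses and stamps requests, so the ordering between disputed
    events can later be resolved."""

    def __init__(self):
        self.time = 0

    def local_event(self) -> int:
        self.time += 1
        return self.time

    def receive(self, remote_time: int) -> int:
        # On witnessing a remote event, jump past its timestamp so local
        # stamps always order after everything already seen.
        self.time = max(self.time, remote_time) + 1
        return self.time

# The same key always lands in the same shard:
print(shard_of(b"alice-public-key") == shard_of(b"alice-public-key"))  # True

node = LogicalClock()
node.local_event()        # clock advances to 1
stamp = node.receive(10)  # witnessed an event stamped 10 -> clock jumps to 11
print(stamp)              # 11
```

Because the shard index depends only on the key, any node can locate the shard responsible for an address without global coordination, which is what allows shards to process requests independently.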
Testing On June 11th 2019 (https://www.radixdlt.com/blog/replaying-bitcoin) , Radix conducted a public test to demonstrate Tempo's scalability using 10 years of Bitcoin transaction history. The test replayed all Bitcoin transactions onto a Tempo test network, validating signatures and checking for double spends. The network consisted of over 1000 nodes distributed globally across data centers in North America, Europe, and Asia. Each node was assigned a subset of the massive shared state space to validate. During the test, the network exceeded 1 million transactions per second - processing the entirety of Bitcoin history in under 30 minutes. This provided proof-of-concept that Tempo could support the transaction demands of global scale applications. The test illustrated Tempo's ability to parallelize validation and remove bottlenecks through its sharded architecture and lazy consensus approach. Despite limitations in the redundancy of the test network, it validated core components of Tempo's design. The results did not demonstrate Tempo's theoretical maximum throughput, which could potentially be much higher based on the fixed shard size. However, processing 10 years of Bitcoin transactions in minutes showed the vast scalability improvements made possible by Tempo's innovations. Problems and Transition to Cerberus While the tests showed Tempo's potential for scalability, issues around finality guarantees were uncovered. Tempo's lazy consensus meant transactions could always be reverted if a conflict arose later. This meant there was no absolute finality of transactions. Attempts were made to mitigate this by finalizing transactions after a set time period with no disputes. However, it was found that network faults could still lead to irreconcilable disagreements between nodes if conflicts were missed within the finality window. This lack of transaction finality made Tempo unsuitable for real-world deployment. 
Radix realized a transition to a new consensus protocol would be needed to provide stronger finality guarantees. As final testing was underway, two additional attack vectors on Tempo's consensus system were discovered: - The Weak Atom Problem (https://www.radixdlt.com/blog/the-radix-2020-update-2) - Tempo could not prevent attacker control over fraudulently manufactured transactions, enabling delayed attacks. - Sybil protection issue (https://www.radixdlt.com/post/the-radix-2020-update-2) - Tempo's reputation system Mass was not inherently valuable, allowing attackers to cheaply acquire influence. Extensive internal testing and third-party auditing uncovered these vulnerabilities critical to Tempo's security. Despite exhaustive research, no short-term solutions could be found that would adequately resolve the issues. The insights from analyzing Tempo led Radix to explore combining the benefits of "traditional" BFT-style consensus with the scalability engine provided by sharding. This led to the development of Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) , which built upon the sharding foundations of Tempo while adopting an improved consensus algorithm known as HotStuff. Cerberus utilized sharding and parallel instances of HotStuff consensus to retain Tempo's scalability advantages. The addition of a braiding mechanism to coordinate across shards also provided the transaction finality missing in Tempo. 
Further Reading - Radix Tempo Whitepaper v1.1, 2017-09-25 (https://assets.super.so/2a0d12a5-9454-4417-b17e-d51617137fb6/files/4258f4f2-3e90-40d1-81d8-d3847b3f856a/RadixDLT-Whitepaper-v1.1.pdf) - Radix Public Node Incentives White Paper, 2017-12-24 (https://assets.super.so/2a0d12a5-9454-4417-b17e-d51617137fb6/files/3cc28606-2bad-496f-a997-59f7ad3d85b9/2017-12-24_Radix_Public_Node_Incentives_White_Paper.pdf) - https://medium.com/paradigm-research/radix-detailed-review-on-the-project-248a11c0c710 (https://medium.com/paradigm-research/radix-detailed-review-on-the-project-248a11c0c710) - https://dev.to/radixdlt/knowledgebase-update-atom-model-263i (https://dev.to/radixdlt/knowledgebase-update-atom-model-263i) DEVELOPMENT Whitepapers Tempo Whitepaper (https://assets.super.so/2a0d12a5-9454-4417-b17e-d51617137fb6/files/e4f934ff-25c3-4ffe-a6dd-2bbb82714e64.pdf) , Public Node Incentives Paper (https://file.notion.so/f/f/519183ae-9211-4e91-b978-311452f08ddd/ea3a263c-16cc-4eb2-b2fc-88a38dbe8035/2017-12-24_Radix_Public_Node_Incentives_White_Paper.pdf) LEDGER Sybil Protection Mass State Model Sharded (/contents/tech/core-concepts/sharding) (18.4 * 10^18 shards) Consensus Mechanism HotStuff (https://hackernoon.com/hotstuff-the-consensus-protocol-behind-safestake-and-facebooks-librabft) Fault Tolerance Byzantine Fault Deterministic Execution Environment Radix Engine (/contents/tech/core-protocols/radix-engine) v1 ## Flexathon URL: https://radix.wiki/contents/tech/research/flexathon Updated: 2026-02-03 Flexathon (https://flexathon.net/) [ /‘flɛksəθɔn/ ] was a 3-week hackathon project (https://docs.google.com/document/d/1HkywxibycsVjq5e63_ZgyRycBRI3Myya3xLC56tVOP8/edit#heading=h.1lj0fweix9i8) by Dan Hughes (/community/dan-hughes) to demonstrate innovations the Radix team had been pioneering in areas like consensus algorithms (/contents/tech/core-protocols/cerberus-consensus-protocol) and sharding (/contents/tech/core-concepts/sharding) . 
Timeline and Duration Hughes dedicated his vacation time in November-December 2020 to build out a functioning prototype sharded network exhibiting features like atomic composability. Hughes provided updates and documentation of his progress via Twitter (https://twitter.com/search?q=%40fuserleer%20%23flexathon&src=recent_search_click&f=live) and GitHub. Goals The Flexathon project was designed to showcase the capabilities enabled by the Radix team's Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) consensus protocol within a sharded (/contents/tech/core-concepts/sharding) environment. Specifically, Hughes set out to build a working implementation that exhibited atomicity, composability and decentralization across shards. The ambitious timeline aimed to dispel skepticism about the feasibility of the Radix innovations. Implementation The implementation consisted of a mix of old and newly developed components. Code was repurposed from the Radix core, TEMPO/CAST modules developed previously, and Hughes' own R&D framework, alongside freshly written code to meet the hackathon requirements. The prototype was not meant to resemble future Radix public networks, merely to demonstrate Cerberus fundamentals. Components Over the three weeks, Hughes built out and demonstrated capabilities including: - Network connectivity between nodes - Sending 'atoms' across the network - Atom pool voting for consensus - Early versions of features like block production and proof-of-work algorithms Impact The Flexathon prototype served as a successful proof-of-concept for the Radix team's innovations and helped validate the potential of the Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) protocol. By showcasing key capabilities like cross-shard atomicity and decentralized composability, Hughes managed to garner interest in Radix's technology vision. 
However, the prototype itself was not intended to be a direct precursor to later Radix public networks, and substantial production hardening would have been required. The piecemeal codebase, while helpful for rapid prototyping, did not meet the reliability and security standards demanded of live DLT networks. Nonetheless, the fundamental atomic composability exhibited during the demo laid the foundations for Radix's subsequent Olympia (/contents/tech/releases/radix-mainnet-olympia) release, leading up to the Alexandria (/contents/tech/releases/radix-developer-environment-alexandria) developer environment. Components pioneered during Flexathon found use in further R&D by Hughes and fed into the roadmap for the Radix protocol moving forward. ## eMunie URL: https://radix.wiki/contents/tech/research/emunie Updated: 2026-02-03 eMunie [ /‘i ˈmʌni/ ] was the name given to Dan Hughes (/community/dan-hughes) ’ distributed ledger project prior to the launch of Ethereum (~2013). History Hughes’ initial research (https://www.reddit.com/r/eMunie/) centered around the concept of a stable-coin. Additional input came from the prevailing discussions on Bitcointalk about the definition of a currency vs commodity money (such as gold) between advocates of Ludwig von Mises, Keynes and Hayek. Early Radix community members, including Shawn 'Peachy' Keehn, were drawn to Hughes’ idea of a stable coin with a flexible supply model that could grow autonomously whilst maintaining parity; i.e. an autonomous AI-centric Federal Reserve, which could sense real-time demand signals and adjust the supply accordingly. This goal is still on the roadmap and 2.1bn $XRD have been reserved for that purpose. "Most of those early years were with a sizable group of us having fun with the different versions of the beta client and network that Dan would rollout daily where we could use a ‘spambot’ to simulate txns going around the world to other members to stress-test the peak throughput. 
Greg, one of the original founders, was notorious during these events as his slow-ass Internet often proved to be the hammer that brought down the system. (Sorry to throw you under the bus Greg! I love you bro’). Each of these tests gave Dan the insight he needed to understand the full S.W.O.T. of various network designs and harden the code further. The original eMunie client was super fun and awesome to use. Dan’s a Java-uber-whiz and was able to put the front-end together quickly and easily. That allowed him to develop some novel ideas and prototype solutions that became the genesis for some things that other projects now take the credit for developing. This never bothered Dan as he was enamored by the crypto space at the time (we all were) as it had more of a collaborative feel and ethos vs. the distinctly ruthless-naked-capitalist vibe that now seems to have prostituted Satoshi’s original idea. Some of his original concepts that are now used regularly are things like ENS (emunie naming system) which you now find with several projects. His ‘marketplace’ concept (i.e. decentralized Amazon) was also novel at that time. I remember another project (open bazaar I believe) sprung up a few months later to try it out as a separate concept. I also remember when we were originally toying with the DAG data-model. The lead founder from IOTA ( come-from-beyond (https://twitter.com/c___f___b) ) was constantly over on our forum (https://bitcointalk.org/index.php?topic=1191535.msg12531672#msg12531672) , but he was highly interested in our findings and my personal opinion is that he used that concept for the whole IOTA development framework. I’m sure there will be disagreements on that topic, but it’s how I remember it. Dan’s reasons for not pursuing the DAG-model are now easily visible for why IOTA is experiencing the challenges with the model currently. There are also a few other ‘borrowed’ concepts as well and I’ll post them as I remember them. 
The overarching goal for the client back then (and will be part of the driving force for any future client) is, and always will be, ease-of-use (i.e. Grandma proof). If it isn’t easy to use then adoption will suffer." - Shawn 'Peachy' Keehn, November 2020, Telegram (https://t.me/radix_dlt/109689) . Features The eMunie client integrated the following features: - Wallets (with easily-remembered ENS names), with attachments enabled - Multi-sig wallets (simple to use) - Escrow wallets with timed payment releases - Telegram-like fully encrypted chat rooms - Private email messaging with integrated address book - Marketplace (decentralized Amazon) - AppStore (decentralized Google Play / iStore) - Browser (for private surfing) - Asset creation system - Ratings system (i.e. Yelp-like user ratings) eMunie POS Terminal Demonstration Further Reading - https://www.reddit.com/r/eMunie/ - https://bitcointalk.org/index.php?topic=411366.0 - https://bitcointalk.org/index.php?topic=419529 - https://www.newsbtc.com/sponsored/in-conversation-with-emunie-founder-dan-hughes/ DEVELOPMENT - Founded: 2013-03 - Founder(s): Dan Hughes (https://www.linkedin.com/in/dan-hughes-2a6b117/) - Successor: Radix (/) LEDGER - Sybil Protection: Delegated Proof of Stake - State Model: Unsharded - Native Asset: $EMU SOCIAL - Twitter: @emunie_emu (https://twitter.com/emunie_emu) ## Cassandra URL: https://radix.wiki/contents/tech/research/cassandra Updated: 2026-02-03 Cassandra [ /kəˈsændrə/ ] was a research project spearheaded by Dan Hughes (/community/dan-hughes) to address two critical issues (https://www.youtube.com/watch?v=1rNeL-X40lc&t=530s) that could arise in a sharded (/contents/tech/core-concepts/sharding) network, specifically Radix’s forthcoming Xi'an 
(/contents/tech/releases/radix-mainnet-xian) release: - What happens if a subset of validators cease processing transactions, leading to a liveness failure? - How can validator sets be dynamically adjusted in case of liveness failures? Overview Cassandra was a collaboration between an unofficial organization called Radix Labs, the University of California Davis’ ExpoLab (https://expolab.org/) , and community members running nodes. Part of the project was to create a decentralized test version of Twitter (https://youtu.be/PzU_Tiqm4xQ) with the ability to tip users in $XRD. Etymology Cassandra was named (https://twitter.com/fuserleer/status/1352217981242269697) after the mythological Trojan priestess (https://en.wikipedia.org/wiki/Cassandra) whose accurate prophecies were fated never to be believed. Background Cassandra evolved out of Dan Hughes (/community/dan-hughes) ’ Flexathon (/contents/tech/research/flexathon) project in January 2021 (https://discord.com/channels/417762285172555786/803314424130961459/803316633623461899) . A technical demonstration using Twitter data (https://www.twitch.tv/videos/947651340) premiered on Twitch on March 13th, 2021. 40% of the project was described as ‘new code’ (https://www.twitch.tv/fuserleer/clip/InnocentStrongEelDancingBaby-bK4waibRsywrTPxa) . Implementing the Xi'an (/contents/tech/releases/radix-mainnet-xian) upgrade to a fully sharded network is necessary for Radix to scale to “ support the needs of the $400 trillion global financial market (https://www.radixdlt.com/blog/designing-governance-with-400-trillion-of-potential) ”. However, transitioning to an explicitly sharded system introduces significant security, decentralization, and liveness challenges. Sharding enables the ledger to process transactions in parallel across multiple shards, but raises questions around how validators across different shards can remain in consensus on the global state. 
To study these issues, Hughes started the Cassandra research project from scratch (https://youtu.be/1rNeL-X40lc?si=uYMy354ucXuENMuR) , setting aside existing consensus algorithms like Cerberus and HotStuff that Radix currently uses. The objective was to take a blue sky approach to determine what components are necessary to create a secure, decentralized, and live sharded environment. Question 1: Liveness in Sharded Networks In a sharded environment, it is possible for a subset of validators across different shards to stop processing transactions, leading to a liveness failure where parts of the network stall. This can happen unintentionally due to technical issues or intentionally by adversaries looking to disrupt the network. Liveness failures are especially problematic in networks like Radix that use deterministic consensus algorithms to achieve transaction finality. These algorithms require approval from 2/3 of validators to finalize transactions. If 1/3 or more validators stop participating, the network cannot make progress. Existing consensus algorithms lacked strong provisions to ensure liveness in sharded environments. Hughes wanted to find a solution that would allow the Radix network to maintain weak liveness during partial liveness failures, so that the live parts of the network could continue making progress. Question 2: Dynamically Adjusting Validator Sets In decentralized networks, validator sets are determined algorithmically through ‘sybil protection’ mechanisms like staking (/contents/tech/core-concepts/delegated-proof-of-stake-dpos) . However, liveness failures may require proactively removing non-participating validators from the active set. This raises a challenge: How can validators be removed without centralized control of the ledger? In addition, any validator set adjustment process could be vulnerable to malicious validators looking to attack the network without needing 1/3 of the total stake. 
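The finality threshold described above can be made concrete with a small sketch. This is plain Python, not Radix code, and it treats the quorum as strictly more than 2/3 of stake, the usual convention for BFT-style deterministic consensus:

```python
# Toy illustration of the finality threshold: finalization needs more
# than 2/3 of total stake online, so the network stalls once 1/3 or
# more of stake stops participating.
from fractions import Fraction

def can_finalize(online_stake: int, total_stake: int) -> bool:
    """Return True if the online stake clears the > 2/3 finality threshold."""
    return Fraction(online_stake, total_stake) > Fraction(2, 3)

# With 100 units of total stake:
assert can_finalize(90, 100)      # 10% offline: finality continues
assert not can_finalize(66, 100)  # 34% offline: finality stalls
assert not can_finalize(50, 100)  # 50% offline: finality stalls
```

Exact fractions are used rather than floats so the boundary case (exactly 2/3 online) is rejected without rounding ambiguity.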
Hughes wanted to find a solution that would allow validator sets to be adjusted in response to liveness issues, but prevent adversaries from manipulating this process to their advantage. State Model Cassandra exists on a fixed shardspace (/contents/tech/core-concepts/sharding) of 2^256 shards indexed on the Vamos (/contents/tech/core-protocols/vamos-database) database structure. Transactions are processed on a first-come, first-served basis (https://discord.com/channels/417762285172555786/803314424130961459/898666266607358032) and in parallel if the transactions use different inputs. The network scales linearly as more nodes are added. Consensus Cassandra used a hybrid probabilistic / deterministic consensus (https://www.youtube.com/watch?v=1rNeL-X40lc&t=1750s) , where round leaders within shard groups are determined by proof of work. This provides flexibility to maintain liveness and safety under different network conditions. Cassandra incorporates elements (https://youtu.be/1rNeL-X40lc?t=1740) of Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) , but is distinct from it. Probabilistic Phase The probabilistic phase is driven by a proof-of-work (/contents/resources/python-scripts/proof-of-work) system. Validators who wish to propose the next set of transactions must solve a computational ‘puzzle’ to submit proposals. This makes the cost of submitting proposals proportional to computational power rather than stake. If validators submit conflicting proposals, the ledger forks until 2/3 of the validators vote in favor of the canonical version. This allows the network to dynamically adjust relative voting powers and make progress even if 1/3 or more of validators are not participating. A key advantage of Cassandra's probabilistic phase is that validators are rewarded with increased voting power on the network for providing useful proposals. This allows the network to recover from liveness failures by bringing in new validators. 
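This recovery dynamic can be sketched as a toy simulation. The actual Cassandra reward schedule is not specified in the sources above, so a flat per-round reward for participating validators is assumed here purely for illustration:

```python
# Toy simulation (not Cassandra's actual reward schedule) of liveness
# recovery: participating validators earn extra voting power each round,
# gradually diluting the weight of offline validators until the > 2/3
# finality quorum is restored.

def rounds_until_quorum(online_power: float, offline_power: float,
                        reward_per_round: float = 1.0) -> int:
    """Rounds of proposing needed before online power exceeds 2/3 of total."""
    rounds = 0
    while online_power <= 2 * (online_power + offline_power) / 3:
        online_power += reward_per_round  # assumed flat per-round reward
        rounds += 1
    return rounds

# 50/50 split, as in the example in the text: the online half must
# slightly more than double its power before it holds over 2/3 of the
# (growing) total.
print(rounds_until_quorum(50.0, 50.0))  # 51 rounds at a reward of 1.0
print(rounds_until_quorum(90.0, 10.0))  # 0 rounds: quorum already held
```

Note the total power grows as rewards are issued, which is why the online half needs to exceed twice the offline power, not merely match the old total.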
For example, if 50% of validators go offline, the remaining validators create proposals to make progress on their shard. As these proposals gain majority approval, the participating validators increase their voting power, which dilutes the weight of the offline validators. Over time, this allows the network to regain a 2/3 majority of participating stake needed to finalize transactions. The more stake that goes offline, the more time it takes for the network to ‘earn back’ enough voting power to resume finalizing transactions. Deterministic Phase Once 2/3 of validators agree on a proposal, the consensus process is determined by stake-weighted voting and provides finality on the accepted transactions. Effectiveness Maintaining Liveness The probabilistic proposal phase provides an inherent weak liveness guarantee, even if 1/3 or more validators stop participating. As long as some validators continue submitting proposals, the network can continue building potential blocks. This is similar to how Bitcoin's Nakamoto consensus allows the network to continue mining and extending the longest chain even if a large portion of miners disappear. Progress is slowed proportional to the lost computational power, but the network continues converging on a canonical chain. In Cassandra, validators who continue participating will eventually converge around a common branch of proposals. This allows the network to continue making progress during liveness failures until enough validators regain participation to finalize transactions again. Maintaining Safety Cassandra's hybrid model maintains a high threshold for safety guarantees, even while providing liveness flexibility. Attackers must overcome two hurdles to violate safety guarantees like double spends: - Gain majority control of proposing power to build fraudulent branches during the probabilistic phase. This requires large computational resources. 
- Acquire 1/3+ of voting power during the deterministic phase to finalize a fraudulent branch. This requires staking resources. Attackers lacking either sufficient compute power or stake cannot unilaterally violate safety. The cost to attack both phases makes Cassandra more robust compared to pure probabilistic or deterministic consensus alone. Advantages & Tradeoffs Advantages Cassandra provides several key advantages compared to traditional consensus algorithms: - Strong safety guarantees: Attackers face high costs to build fraudulent branches and finalize them. - Liveness guarantees: The network can make progress even with partial liveness failures. - Validator set shuffling: The network can adjust relative voting powers to recover from liveness issues. - Protection against attacks: Manipulating validator sets is costly due to the hybrid model. Tradeoffs - Slower finality time: Cross-shard coordination introduces additional latency before finality. - Throughput vs finality: Faster finality requires more centralized shard configuration. Slower finality enables higher decentralization. - Complexity: Hybrid model is more complex than pure probabilistic or deterministic algorithms. While Cassandra provides significant advantages, there are also tradeoffs to consider. The hybrid model requires more coordination which can introduce latency. However, it allows Radix to optimize for both decentralization and throughput. The Cassandra research demonstrates that maintaining both liveness and strong safety guarantees in a highly decentralized sharding environment is possible. However, it requires balancing complex tradeoffs between latency, throughput, decentralization, and simplicity. Ongoing Research and Development The Cassandra research project has demonstrated promising solutions to core problems around implementing sharding for Radix. However, significant work remains to refine, optimize, and evaluate Cassandra for production use. 
Optimizing the Protocol While initial results are positive, improvements can be made to Cassandra to optimize latency, throughput, attack resistance, and efficiency. Areas of ongoing research include: - Reducing cross-shard coordination overhead - Optimizing networking and message passing between shards - Improving reward mechanisms and sybil resistance - Enhancing protection against long-range attacks - Minimizing wasted work during probabilistic proposal creation Evaluating Production Viability Before deploying Cassandra, extensive testing and peer review are required to validate the algorithm and analyze any potential vulnerabilities. Research areas include: - Formal verification of safety proofs - Security audits and attack simulation - Benchmarking throughput, latency, and efficiency - Testnet trials and experimental deployment - Continued alignment with academic state of the art Integration into Xi'an While Cassandra may not be deployed as-is, components of it could improve Radix's Xi'an implementation based on HotStuff. Analyzing which specific components can be integrated is an ongoing process. The Cassandra research provides a strong foundation, but remains a work in progress. By building on these concepts, Radix aims to develop production-ready solutions for secure, decentralized, and scalable sharding to meet the demands of global finance. Media Highlight: Cassandra RadFlix! - fuserleer on Twitch (https://m.twitch.tv/videos/1280451487) DEVELOPMENT - State Model: Sharded (/contents/tech/core-concepts/sharding) - Sybil Protection: Hybrid (Proof of Work / Proof of Stake (/contents/tech/core-concepts/delegated-proof-of-stake-dpos) ) - Max. TPS: 200,000 (https://x.com/fuserleer/status/1542582783234777091) (2024-05-15) - dApp Repo: github.com/fpieper/cassandra-dapps (https://github.com/fpieper/cassandra-dapps) ## Vamos Database URL: https://radix.wiki/contents/tech/core-protocols/vamos-database Updated: 2026-02-03 Vamos is a novel database persistence layer developed by Dan Hughes (/community/dan-hughes) , designed (https://x.com/fuserleer/status/1575839325224595457) to provide better performance than traditional databases like LevelDB, RocksDB, and BerkeleyDB under the random-access workloads typical of distributed ledger networks. Background In the realm of distributed ledgers, persistent storage is crucial for tracking various elements such as UTXOs (/contents/tech/core-concepts/unspent-transaction-output-utxo-model) , balances, and smart-contract variables. Traditional databases struggle with the high-entropy, randomized nature of data in blockchain applications, primarily due to the inefficiencies of their indexing methods (such as B-trees and LSM trees) under such workloads. Existing databases generally utilize indexes to track the location of data on the disk. These databases, when dealing with high-entropy identifiers like hashes common in blockchain, face significant performance issues. This is because hashes, even of similar data, look very different, and the databases can't effectively optimize read or write operations for such data. As the size of these databases grows, especially in blockchain contexts, performance degrades significantly due to the increased disk reads required for querying the growing indexes. Concept and Design Vamos was conceptualized by Dan Hughes to overcome these limitations. The core idea revolves around improving the indexing mechanism to suit the high-entropy, randomized nature of blockchain data. Vamos employs a ‘fixed addressable shard-space’ approach for its indexes. 
Each index slot is pre-allocated and has a maximum capacity, ensuring that querying any key always costs a single disk read, regardless of the total number of items in the database. This approach drastically reduces the number of necessary disk reads, enhancing performance. Key Features and Innovations - Fixed Addressable Shard-Space for Indexes: This approach ensures that each key maps to a specific index slot on disk referenced by an in-memory slot table, reducing the complexity and time of queries. - Significant Performance Improvement: Early tests show Vamos maintaining a consistent 1 read per key existence check and 1-2 reads/writes per write operation with increasing data sizes, in contrast to the performance drop observed in traditional databases like BerkeleyDB. - Optimized for Commodity Hardware: Vamos is designed to work efficiently on standard hardware configurations, making it accessible for widespread use. Development and Testing - Hughes re-coded the index portion of the old Vamos code to implement these ideas. - Preliminary tests reveal that Vamos significantly outperforms BerkeleyDB under crypto-like workloads, maintaining nearly twice the performance efficiency at 250 million records. - Continuous improvements in I/O utilization and checkpointing algorithms have further enhanced Vamos's capabilities. - Vamos has been successfully integrated with Cassandra (/contents/tech/research/cassandra) , showing robust performance with no degradation even under substantial loads (500 writes per second on a 50 million tweet dataset). Future Prospects - Vamos continues to mature, showing promising results in preliminary tests and integrations. - It is poised to be used in upcoming public tests for high-throughput applications, potentially reaching throughputs of around 1 million transactions per second. 
- The development of Vamos indicates a potential shift in database technology, especially in the context of blockchain and crypto applications, pushing the limits of what is currently possible. ## Transaction Manifests URL: https://radix.wiki/contents/tech/core-protocols/transaction-manifests Updated: 2026-02-03 A Transaction Manifest is a human-readable description of intended actions to be executed on the Radix network. Overview Unlike traditional blockchain transaction models that rely on opaque smart contract calls, Transaction Manifests are inherently asset-oriented transactions that provide a listing of commands (https://learn.radixdlt.com/article/what-are-transaction-manifests) to the Radix network specifically focused on the movement of assets between accounts and decentralized application (dApp) components. The manifest system represents a fundamental shift in how blockchain transactions are constructed and validated. Traditional blockchain platforms like Ethereum typically use transactions that simply pass a message to a single smart contract (https://learn.radixdlt.com/article/what-are-transaction-manifests) , which then coordinates whatever should happen internally. These messages are usually hashed, making them unreadable to users, and even if they weren't hashed, they would tell users nothing about the actual outcome of the transaction. In contrast, Radix Transaction Manifests describe movements of assets in a way that mirrors real-world interactions (https://docs.radixdlt.com/docs/manifest) . They make it possible to compose multiple actions to be executed atomically by describing a sequence of component calls and movements of resources between components. For example, a simple token transfer manifest would explicitly describe withdrawing tokens from one account and depositing them into another, rather than just calling a generic "transfer" function. 
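A sketch of what such a transfer might look like in manifest syntax, using placeholder addresses and bucket names; the instruction names follow the Radix documentation linked above, but the exact, current syntax (in particular the final deposit argument) should be checked against those docs:

```
CALL_METHOD
    Address("account_rdx1_sender_placeholder")
    "withdraw"
    Address("resource_rdx1_token_placeholder")
    Decimal("10");
TAKE_FROM_WORKTOP
    Address("resource_rdx1_token_placeholder")
    Decimal("10")
    Bucket("tokens");
CALL_METHOD
    Address("account_rdx1_recipient_placeholder")
    "try_deposit_or_abort"
    Bucket("tokens")
    Enum<0u8>();
```

The withdrawn tokens land on the worktop, are gathered into a named bucket, and are then deposited; if any step fails, the entire transaction is rolled back.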
A key innovation of the manifest system is its ability to provide clear guarantees about transaction outcomes. The manifest can include assertions about minimum amounts of assets received or specific outcomes that must be met (https://docs.radixdlt.com/docs/manifest-instructions) , otherwise the transaction will fail. This helps protect users from unexpected results and various forms of transaction manipulation. The manifest system also enables true atomic composability without requiring special smart contract code. Because the manifest can describe asset movements between multiple components (https://docs.radixdlt.com/docs/writing-manifests) , different dApp functionalities can be seamlessly combined in a single transaction. For instance, a user could compose a transaction that swaps tokens on one decentralized exchange and immediately uses those tokens in another dApp, all guaranteed to either completely succeed or completely fail. With the introduction of subintents (also known as pre-authorizations) in the Cuttlefish network upgrade, the manifest system gained additional capabilities for coordinating multi-party transactions (https://docs.radixdlt.com/docs/pre-authorizations-and-subintents) . This allows for advanced use cases like delegated fee payments, where a dApp can pay transaction fees on behalf of users, and intent-based partial trades, where users can safely express one side of a potential trade that can be matched later. The Transaction Manifest system represents a core innovation in blockchain architecture, providing a more transparent, composable, and user-focused approach to transaction construction and validation. By making transactions human-readable and focusing on explicit asset movements, it helps users understand exactly what they're authorizing while enabling powerful new patterns of interaction between dApps and users. 
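As an example of the assertion mechanism just described, a manifest could demand a minimum amount on the worktop after a swap before allowing the transaction to proceed. This is a hedged sketch with a placeholder address; see the linked instruction reference for the exact syntax:

```
ASSERT_WORKTOP_CONTAINS
    Address("resource_rdx1_token_placeholder")
    Decimal("95");
```

If the worktop holds less than 95 units of the resource at this point, the transaction fails and every prior step is rolled back, protecting the user from a worse-than-expected swap outcome.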
Key Features - Asset-Oriented Design: Transaction Manifests are built around the explicit movement of assets (https://learn.radixdlt.com/article/what-are-transaction-manifests) , reflecting real-world interactions rather than abstract smart contract state changes. This design choice makes transactions more intuitive and easier to understand. For example, a token swap transaction clearly shows assets being withdrawn from a user's account, passed to a decentralized exchange component, and the received tokens being deposited back to the user's account. - Human-Readable Format: Unlike traditional blockchain transactions that often present as opaque hexadecimal strings or hashes, Transaction Manifests use a human-readable format that describes each step of the transaction (https://docs.radixdlt.com/docs/manifest-instructions) . This transparency allows users and developers to understand exactly what a transaction will do before it's executed. The manifest syntax includes clear commands for actions like withdrawing assets, calling component methods, and making assertions about transaction outcomes. - Transaction Guarantees: The manifest system allows users to specify explicit guarantees about transaction outcomes (https://docs.radixdlt.com/docs/manifest-instructions) . Through assertion commands, users can ensure minimum amounts of assets received or specific transaction conditions that must be met. If these conditions aren't satisfied, the transaction automatically fails and rolls back, protecting users from unexpected results. - Native Atomic Composability: One of the most powerful features of Transaction Manifests is their inherent support for atomic composability. Multiple component interactions can be seamlessly combined within a single transaction (https://docs.radixdlt.com/docs/writing-manifests) , with all operations either succeeding together or failing together. 
This eliminates the need for complex smart contract integrations when combining different dApp functionalities. - Authorization Model: Transaction Manifests implement a sophisticated authorization system through the concept of proofs and the authorization zone. Rather than using simple account signatures, the system allows for flexible authorization rules (https://docs.radixdlt.com/docs/manifest-instructions) that can require various combinations of resource ownership, badges, or other credentials. This enables complex access control while maintaining transaction transparency. - Delegated Fee Payments: The manifest system introduces innovative approaches to transaction fees. Through the pre-authorization system, dApps can pay transaction fees on behalf of users (https://docs.radixdlt.com/docs/pre-authorizations-and-subintents) , removing the need for users to hold network tokens just to interact with applications. This solves the common "double onboarding" problem where users need both application tokens and network tokens to use a dApp. - Resource Safety: Transaction Manifests enforce strict rules about resource handling (https://docs.radixdlt.com/docs/manifest-instructions) . All resources withdrawn during a transaction must be explicitly accounted for - either deposited somewhere or burned - by the end of the transaction. This prevents resources from being accidentally lost or stranded during complex operations. - Multi-Party Transaction Support: Through the subintent system, Transaction Manifests enable sophisticated multi-party transactions (https://docs.radixdlt.com/docs/pre-authorizations-and-subintents) . Different users can independently authorize their portions of a larger atomic transaction, which can then be combined and executed as a single unit. This enables new use cases like coordinated purchases, atomic swaps, and other operations requiring multiple parties. - Transaction Preview: The manifest system supports accurate transaction previews. 
Because manifests explicitly describe all resource movements (https://learn.radixdlt.com/article/what-are-transaction-manifests) , wallets and other client software can show users exactly what will happen in a transaction before they sign it. This includes showing what assets will be spent, what will be received, and what components will be interacted with. Structure - Basic Components: The structure of a Radix Transaction Manifest is built around a sequence of instructions that define the transaction's behavior. Each instruction begins with a command specifying the type of operation, followed by zero or more arguments in manifest value syntax, and ends with a semicolon (https://docs.radixdlt.com/docs/manifest-instructions) . These instructions are processed in order by the Radix Engine, with the entire transaction rolling back if any instruction fails. - Transaction Intent: At the core of every transaction is a transaction intent (https://docs.radixdlt.com/docs/transaction-intents) , which contains: - A transaction header defining the notary and tip. - An intent header capturing validity constraints. - An optional message. - The transaction manifest itself. - Zero or more subintents. - Manifest Instructions: Instructions in the manifest fall into several main categories: - Resource management (withdrawing, depositing, and moving assets). - Method calls to components. - Authorization instructions. - Assertions and guarantees about transaction outcomes. Core Concepts - Transaction Worktop: The transaction worktop serves as a temporary storage area for resources during transaction execution (https://docs.radixdlt.com/docs/writing-manifests) . Resources returned from component calls are automatically placed on the worktop, where they can be organized into buckets for use in subsequent instructions. The worktop must be empty by the end of the transaction, ensuring all resources are properly accounted for. 
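The worktop mechanics described above can be modeled with a small toy. This is plain Python, not Radix Engine code, and the names are illustrative:

```python
# Toy model of the transaction worktop: resources returned by component
# calls land on the worktop, buckets are carved out of it for later
# instructions, and the worktop must be empty when the transaction ends
# or the whole transaction fails.
class Worktop:
    def __init__(self):
        self.resources: dict[str, int] = {}  # resource address -> amount

    def put(self, resource: str, amount: int) -> None:
        """A component call returned resources; they land on the worktop."""
        self.resources[resource] = self.resources.get(resource, 0) + amount

    def take(self, resource: str, amount: int) -> tuple:
        """Like TAKE_FROM_WORKTOP: move resources into a 'bucket'."""
        if self.resources.get(resource, 0) < amount:
            raise RuntimeError("insufficient resources on worktop")
        self.resources[resource] -= amount
        if self.resources[resource] == 0:
            del self.resources[resource]
        return (resource, amount)

    def assert_empty(self) -> None:
        """End of transaction: leftover resources abort the transaction."""
        if self.resources:
            raise RuntimeError(f"unaccounted resources: {self.resources}")

w = Worktop()
w.put("resource_xyz", 10)          # e.g. returned by an account withdrawal
bucket = w.take("resource_xyz", 10)
# ...the bucket would be passed to a deposit call here...
w.assert_empty()                   # succeeds: everything accounted for
```

Leaving anything on the worktop at the end raises an error in this model, mirroring the rule that all withdrawn resources must be deposited or burned before the transaction completes.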
- Authorization Zone: The authorization zone is a specialized space that holds proofs required for transaction authorization (https://docs.radixdlt.com/docs/writing-manifests) . These proofs can come from two sources: - Signatures on the transaction, which are automatically converted to "virtual signature proofs". - Proofs returned by method calls during transaction execution. - Buckets: Buckets act as containers for resources being moved within a transaction (https://docs.radixdlt.com/docs/manifest-instructions) . They can be created from resources on the worktop and then passed as arguments to component method calls. Buckets provide a way to precisely control and track resource movements throughout the transaction. - Proofs: The proof system is central to transaction authorization (https://docs.radixdlt.com/docs/writing-manifests) . Proofs demonstrate ownership or permission to interact with components and resources. They can be: - Created from resources in buckets. - Generated from signatures. - Obtained from component method calls. Value Representation Transaction Manifests use a specific syntax for representing values (https://docs.radixdlt.com/docs/manifest-value-syntax) , which includes: - Basic leaf values (integers, strings, booleans). - Composite values (tuples, enums, arrays, maps). - Special manifest-specific values (addresses, buckets, proofs). Subintents and Pre-authorizations The manifest system supports subintents (also called pre-authorizations) (https://docs.radixdlt.com/docs/pre-authorizations-and-subintents) , which allow portions of larger transactions to be independently authorized. Each subintent contains its own manifest and can interact with its parent transaction through a yielding mechanism. This enables complex multi-party transactions while maintaining atomic execution guarantees. 
Notarization Every transaction requires notarization (https://docs.radixdlt.com/docs/transaction-notary) , which serves to: - Verify the correctness of transaction signatures. - Prevent tampering with signed transaction content. - Enable multi-signature coordination. - Support transaction submission control. The notary is typically the party orchestrating the transaction construction and submission process, such as a wallet application or a transaction aggregator service. Transaction Types Standard Transaction Types - General Transactions: The most common type of transaction on the Radix network is the general transaction (https://docs.radixdlt.com/docs/conforming-manifest-types) , which can include withdrawals from accounts, calls to non-account components, deposits to accounts, and proof productions. This flexible transaction type supports the majority of typical DeFi interactions while maintaining clear summaries of what matters to users: what they lose, what they gain, and what they're interacting with. - Simple Transfers: Simple transfers represent a specialized subset of transactions designed specifically for straightforward asset movements between accounts (https://docs.radixdlt.com/docs/simple-token-transfer) . These transactions are structured to be easily understood by hardware wallets and typically involve a single withdrawal from one account and a single deposit to another account. This type is particularly important for basic token transfers and integration with hardware security modules. - Pool Operations: The manifest system includes specialized transaction types for interacting with liquidity pools. These include dedicated patterns for contributing to pools and redeeming from pools (https://docs.radixdlt.com/docs/conforming-manifest-types) , ensuring these common DeFi operations can be clearly presented to users with appropriate guarantees about outcomes. 
- Validator Operations: Specific transaction types exist for validator-related operations (https://docs.radixdlt.com/docs/conforming-manifest-types) , including: - Staking XRD to validators and receiving liquid stake unit tokens. - Unstaking from validators and receiving claim NFTs. - Claiming unstaked XRD using claim NFTs. Advanced Transaction Types - Resource Creation: Transaction Manifests support the creation of both fungible and non-fungible resources (https://docs.radixdlt.com/docs/transaction-non-fungible-resource-creation) through dedicated manifest instructions. These transactions handle the specification of resource properties, initial supply, and metadata, enabling the creation of new tokens and NFTs directly through transactions. - Package Deployment: Transactions can deploy new packages (smart contract code) to the network. These transactions include special instructions for publishing code and setting up package configuration (https://docs.radixdlt.com/docs/manifest-instructions) , including owner roles and metadata. - Account Configuration: Special transaction types exist for updating account settings (https://docs.radixdlt.com/docs/conforming-manifest-types) , such as modifying deposit rules, resource preferences, and authorized depositor lists. These transactions allow users to control how their accounts interact with the broader network. Subintents and Multi-Party Transactions - Pre-authorization Flows: The manifest system supports sophisticated multi-party transactions through its pre-authorization system (https://docs.radixdlt.com/docs/pre-authorizations-and-subintents) . Users can authorize portions of larger atomic transactions independently, which can then be combined into complete transactions. 
This enables several advanced use cases: - Delegated Fee Payment: One of the most important applications of subintents is delegated fee payment, where dApps can pay transaction fees on behalf of users (https://docs.radixdlt.com/docs/pre-authorizations-and-subintents) . This solves the "double onboarding" problem by removing the requirement for users to hold network tokens just to interact with applications. - Atomic Trading: The subintent system enables intent-based partial trades (https://docs.radixdlt.com/docs/pre-authorizations-and-subintents) , where users can express one side of a desired trade through a subintent. These subintents can then be matched and combined by aggregator services to create atomic trades between parties. - Coordinated Actions: Multiple users can coordinate complex transactions (https://docs.radixdlt.com/docs/pre-authorizations-and-subintents) where all participants need to agree and participate for the transaction to succeed. This enables use cases like coordinated ticket purchases where either all participants succeed in their purchase or none do. Technical Implementation Instruction System Basic Structure Transaction Manifests adopt a bash-like grammar where each instruction contains a command type, zero or more arguments in manifest value syntax, and ends with a semicolon (https://docs.radixdlt.com/docs/manifest-instructions) . This structure allows for clear expression of transaction intent while maintaining machine processability. Instruction Categories The manifest system includes several major categories of instructions: - Resource Management: These instructions handle the movement and manipulation of resources (https://docs.radixdlt.com/docs/manifest-instructions) , including operations like: - Taking resources from the worktop. - Creating and managing buckets. - Burning resources. - Making assertions about resource quantities. 
- Component Interaction: Instructions for interacting with components include method calls, function calls, and specialized instructions for common operations like creating accounts, minting tokens, and managing metadata (https://docs.radixdlt.com/docs/manifest-instructions) . - Authorization Control: The authorization system uses instructions for managing proofs (https://docs.radixdlt.com/docs/manifest-instructions) , including: - Creating proofs from buckets or the authorization zone. - Cloning and dropping proofs. - Managing the authorization zone. Processing Pipeline Manifest Compilation When a transaction is created, the manifest is first processed into a standardized format (https://docs.radixdlt.com/docs/transaction-structure) . This involves: - Validating instruction syntax. - Converting named references to indexed references. - Organizing instructions into a structured format. Transaction Structure The processed transaction includes several layers (https://docs.radixdlt.com/docs/transaction-structure) : - The notarized transaction (outer layer). - The signed transaction intent. - The transaction intent. - Individual subintents. Each layer includes appropriate signatures and validation information. Intent Processing The execution of a transaction follows the intent structure (https://docs.radixdlt.com/docs/intent-structure) , with: - A single transaction intent at the root. - Zero or more subintents arranged in a tree structure. - Controlled interaction between intents through yielding mechanisms. Value Representation SBOR Implementation The manifest system uses a specific implementation of SBOR (Scrypto Binary Object Representation) (https://docs.radixdlt.com/docs/manifest-value-syntax) for encoding values. This includes: - Basic value types (numbers, strings, booleans). - Composite types (tuples, enums, arrays, maps). - Special manifest-specific types (addresses, buckets, proofs). 
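The bash-like instruction grammar described earlier (command name, zero or more arguments, terminating semicolon) lends itself to simple tooling. Below is an illustrative plain-Rust sketch of that shape; it is not the real manifest compiler, and it ignores the full manifest value syntax by splitting on whitespace only:

```rust
/// Illustrative sketch only: each manifest instruction is a command name,
/// zero or more arguments, and a terminating semicolon. The real compiler
/// parses the full manifest value syntax; this toy version splits on
/// whitespace and so cannot handle values containing spaces.
#[derive(Debug, PartialEq)]
struct Instruction {
    command: String,
    args: Vec<String>,
}

fn parse(manifest: &str) -> Vec<Instruction> {
    manifest
        .split(';') // instructions end with a semicolon
        .map(str::trim)
        .filter(|stmt| !stmt.is_empty())
        .map(|stmt| {
            let mut tokens = stmt.split_whitespace();
            Instruction {
                command: tokens.next().unwrap_or_default().to_string(),
                args: tokens.map(String::from).collect(),
            }
        })
        .collect()
}

fn main() {
    let manifest = r#"
        TAKE_FROM_WORKTOP Address("resource_x") Decimal("10") Bucket("b");
        CALL_METHOD Address("account_y") "deposit" Bucket("b");
    "#;
    for instruction in parse(manifest) {
        println!("{} with {} argument(s)", instruction.command, instruction.args.len());
    }
}
```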
Type Conversion When processing transactions, manifest values are converted between different SBOR implementations (https://docs.radixdlt.com/docs/manifest-value-syntax) : - Manifest SBOR (used in the transaction manifest). - Scrypto SBOR (used by the Radix Engine). Execution Model Atomic Processing All instructions in a manifest must complete successfully for the transaction to succeed (https://docs.radixdlt.com/docs/manifest) . If any instruction fails: - The entire transaction is rolled back. - No state changes are committed. - Any locked fees are consumed. Resource Tracking The execution system maintains strict tracking of resources through: - The transaction worktop. - Named buckets. - The authorization zone. All resources must be accounted for by the end of transaction execution. Multi-Intent Coordination For transactions involving multiple intents (https://docs.radixdlt.com/docs/pre-authorizations-and-subintents) , the system coordinates: - Parent-child relationships between intents. - Resource passing through yields. - Authorization verification between intents. - Atomic execution across all intents. Advantages Over Traditional Models Asset-Oriented vs Message-Based Transactions Traditional blockchain platforms like Ethereum use a message-based transaction model (https://learn.radixdlt.com/article/what-are-transaction-manifests) where transactions are simply messages passed to smart contracts. This approach requires users to trust that smart contracts will handle their assets correctly without any clear visibility into what will happen. In contrast, Radix Transaction Manifests explicitly describe the movement of assets between accounts and components, making transaction outcomes transparent and predictable. 
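The all-or-nothing execution model described above can be sketched as running instructions against a working copy of state that is committed only when every instruction succeeds. This is an illustrative analogy in plain Rust, with hypothetical `State` and `Instr` types rather than Radix Engine internals:

```rust
/// Illustrative analogy only; `State` and `Instr` are hypothetical types,
/// not Radix Engine internals.
#[derive(Clone, Debug, PartialEq)]
struct State {
    balance: i64,
}

type Instr = fn(&mut State) -> Result<(), String>;

/// Run every instruction against a working copy of state, committing the
/// copy only if all instructions succeed; any failure leaves the original
/// state untouched (the "rollback").
fn execute_atomically(state: &mut State, instructions: &[Instr]) -> Result<(), String> {
    let mut working = state.clone();
    for instruction in instructions {
        instruction(&mut working)?; // first failure aborts the whole run
    }
    *state = working; // commit only on full success
    Ok(())
}

fn main() {
    let mut state = State { balance: 100 };
    let overdraw: Instr = |s| {
        if s.balance < 150 {
            return Err("insufficient funds".to_string());
        }
        s.balance -= 150;
        Ok(())
    };
    // The failing instruction rolls the whole "transaction" back:
    assert!(execute_atomically(&mut state, &[overdraw]).is_err());
    assert_eq!(state.balance, 100);
    println!("state unchanged after failed run: {state:?}");
}
```

Note that on the real network a failed transaction still consumes its locked fees, which this toy model does not capture.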
Enhanced Security Model - Prevention of Blind Signing: On traditional platforms, users often must "blind sign" transactions (https://docs.radixdlt.com/docs/conforming-manifest-types) , seeing only an opaque transaction hash or limited parameter information. This has led to numerous hacks and exploits where users unknowingly authorize malicious transactions. Transaction Manifests solve this by providing clear, human-readable descriptions of all asset movements and component interactions. - Resource Safety: The manifest system enforces strict resource tracking throughout transaction execution. Unlike traditional models where assets might be lost due to smart contract bugs (https://docs.radixdlt.com/docs/writing-manifests) , Transaction Manifests require explicit accounting for all resources, ensuring they are either properly deposited or intentionally burned by the end of the transaction. Native Composability - Direct Component Integration: Traditional platforms require special smart contract code to enable interaction between different applications. Transaction Manifests enable native atomic composability (https://learn.radixdlt.com/article/what-are-transaction-manifests) , allowing assets returned from one component to be directly passed to another without additional coordination code. - Multi-Party Transactions: The subintent system provides native support for complex multi-party transactions (https://docs.radixdlt.com/docs/pre-authorizations-and-subintents) without requiring special smart contract logic. This enables use cases like atomic swaps and coordinated purchases that would require complex trustless escrow contracts on traditional platforms. Improved User Experience - Transaction Preview: Because manifests explicitly describe all asset movements (https://docs.radixdlt.com/docs/conforming-manifest-types) , users can see exactly what will happen in a transaction before signing. 
This is in contrast to traditional platforms where transaction outcomes might be unclear until after execution. - Fee Delegation: The manifest system enables innovative approaches to transaction fees. Through pre-authorizations, dApps can pay transaction fees on behalf of users (https://docs.radixdlt.com/docs/pre-authorizations-and-subintents) , solving the "double onboarding" problem where users need both application tokens and network tokens to use a dApp. Technical Advantages - Flexible Authorization: Unlike traditional models that rely primarily on simple account signatures (https://docs.radixdlt.com/docs/writing-manifests) , Transaction Manifests support sophisticated authorization rules through the proof system. This enables complex access control while maintaining transaction transparency. - Transaction Guarantees: The manifest system allows users to specify explicit guarantees about transaction outcomes (https://docs.radixdlt.com/docs/manifest-instructions) through assertion commands. If these conditions aren't met, the transaction automatically fails and rolls back. This provides protection against unexpected outcomes that might occur in traditional smart contract interactions. System Architecture Benefits - State Sharding Support: The manifest system was designed with state sharding in mind (https://learn.radixdlt.com/article/what-are-transaction-manifests) , enabling the network to process transactions in parallel while maintaining atomic execution guarantees. The explicit description of asset movements and component interactions allows the network to understand transaction dependencies without requiring global state access. - Platform Evolution: Due to the fundamental nature of traditional smart contract platforms' transaction models (https://learn.radixdlt.com/article/what-are-transaction-manifests) , they cannot easily adopt manifest-like features without breaking existing applications. 
The Radix manifest system provides a foundation that can evolve while maintaining backward compatibility.

Examples

Basic Token Operations

- Simple Token Transfer: The most basic application is transferring tokens between accounts. Here's an example of transferring 10 XRD:

# Lock fees to pay for the transaction
CALL_METHOD
    Address("account_sim1qsw6l5leaj62zd38x8f6qhlue76f9lz0n2s49s3mzu8qczjhm2")
    "lock_fee"
    Decimal("10");

# Withdraw 10 XRD from account
CALL_METHOD
    Address("account_sim1qsw6l5leaj62zd38x8f6qhlue76f9lz0n2s49s3mzu8qczjhm2")
    "withdraw"
    Address("resource_sim1qyqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqs6d89k")
    Decimal("10");

# Take XRD from worktop and create a bucket
TAKE_FROM_WORKTOP
    Address("resource_sim1qyqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqs6d89k")
    Decimal("10")
    Bucket("xrd_bucket");

# Deposit to recipient account
CALL_METHOD
    Address("account_sim1cyvgx33089ukm2pl97pv4max0x40ruvfy4lt60yvya744cve475365")
    "deposit"
    Bucket("xrd_bucket");

- NFT Creation: Creating a new NFT collection (https://docs.radixdlt.com/docs/transaction-non-fungible-resource-creation) requires more complex configuration:

CREATE_NON_FUNGIBLE_RESOURCE_WITH_INITIAL_SUPPLY
    # Owner role configuration
    Enum()
    # NFT ID type
    Enum()
    # Track supply
    true
    # NFT data schema
    Enum<0u8>(
        Enum<0u8>(
            Tuple(
                Array(),
                Array(),
                Array()
            )
        ),
        Enum<0u8>(66u8),
        Array()
    )
    # Initial NFTs to mint
    Map(
        NonFungibleLocalId("#1#") => Tuple(
            Tuple(
                "Name",
                "Description",
                ""
            )
        )
    )
    # Access rules
    Tuple(
        Some(
            Tuple(
                Some(Enum()),
                Some(Enum())
            )
        ),
        None,
        None,
        None,
        None,
        None,
        None
    )
    # Metadata
    Tuple(
        Map(
            "name" => Tuple(
                Some(Enum("MyNFT")),
                true
            )
        ),
        Map()
    )
    None;

DeFi Operations

- DEX Token Swap: A typical DEX swap transaction (https://docs.radixdlt.com/docs/writing-manifests) combining multiple operations:

# Lock fees
CALL_METHOD
    Address("account_sim1qsw6l5leaj62zd38x8f6qhlue76f9lz0n2s49s3mzu8qczjhm2")
    "lock_fee"
    Decimal("10");

# Withdraw tokens to swap
CALL_METHOD
    Address("account_sim1qsw6l5leaj62zd38x8f6qhlue76f9lz0n2s49s3mzu8qczjhm2")
    "withdraw"
    Address("resource_sim1qyqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqs6d89k")
    Decimal("100");

# Prepare tokens for swap
TAKE_FROM_WORKTOP
    Address("resource_sim1qyqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqs6d89k")
    Decimal("100")
    Bucket("input_tokens");

# Execute swap with minimum output guarantee
CALL_METHOD
    Address("component_sim1q0d9pmtn6xsrsqkdxlzyjrdnc9n94n9fma3jtrrehymst2rv4k")
    "swap"
    Bucket("input_tokens");

# Assert minimum received amount
ASSERT_WORKTOP_CONTAINS
    Address("resource_sim1n2q4le7dpzucmpnksxj5ku28r3t776pgk879cahgm76c2kfpz48fpj")
    Decimal("95");

# Deposit everything received back to account
CALL_METHOD
    Address("account_sim1qsw6l5leaj62zd38x8f6qhlue76f9lz0n2s49s3mzu8qczjhm2")
    "deposit_batch"
    Expression("ENTIRE_WORKTOP");

Multi-Party Transactions

- Pre-authorized Swap: An example of a pre-authorized trading subintent (https://docs.radixdlt.com/docs/pre-authorizations-and-subintents):

# Verify this can only be used by specific aggregator
VERIFY_PARENT
    Enum(
        Enum(
            Enum(
                Enum(
                    NonFungibleGlobalId("resource_sim1nfxxxxxxxxxxed25sgxxxxxxxxx002236757237xxxxxxxxx8x44q5:...")
                )
            )
        )
    );

# Ensure clean starting state
ASSERT_WORKTOP_IS_EMPTY;

# Withdraw tokens to trade
CALL_METHOD
    Address("account_sim1qsw6l5leaj62zd38x8f6qhlue76f9lz0n2s49s3mzu8qczjhm2")
    "withdraw"
    Address("resource_sim1qyqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqs6d89k")
    Decimal("100");

# Prepare tokens and yield to parent
TAKE_FROM_WORKTOP
    Address("resource_sim1qyqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqs6d89k")
    Decimal("100")
    Bucket("trade_tokens");
YIELD_TO_PARENT
    Bucket("trade_tokens");

# Expect specific token type and amount back
ASSERT_WORKTOP_CONTAINS
    Address("resource_sim1n2q4le7dpzucmpnksxj5ku28r3t776pgk879cahgm76c2kfpz48fpj")
    Decimal("95");

# Deposit received tokens
CALL_METHOD
    Address("account_sim1qsw6l5leaj62zd38x8f6qhlue76f9lz0n2s49s3mzu8qczjhm2")
    "deposit_batch"
    Expression("ENTIRE_WORKTOP");

-
Delegated Fee Payment: Example of a dApp paying fees for a user transaction (https://docs.radixdlt.com/docs/pre-authorizations-and-subintents):

# Ensure clean starting state
ASSERT_WORKTOP_IS_EMPTY;

# User withdraws and deposits tokens
CALL_METHOD
    Address("account_sim1qsw6l5leaj62zd38x8f6qhlue76f9lz0n2s49s3mzu8qczjhm2")
    "withdraw"
    Address("resource_sim1qyqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqs6d89k")
    Decimal("1");
TAKE_FROM_WORKTOP
    Address("resource_sim1qyqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqs6d89k")
    Decimal("1")
    Bucket("xrd");

# Small amount sent back to dApp for fees
CALL_METHOD
    Address("component_sim1cpvs7ulg02ah8mhcc84q7zsj4ta3pfz77uknu29xy50yelakkujqze")
    "deposit"
    Bucket("xrd");

# Return control to parent intent which handles fees
YIELD_TO_PARENT;

## Smart Accounts URL: https://radix.wiki/contents/tech/core-protocols/smart-accounts Updated: 2026-02-03 Smart Accounts are a new type of ledger account introduced by Radix that provides advanced features for authentication, authorization, and recoverability. Overview On the Radix network, accounts are implemented as a built-in component that serves as an "intelligent storage and management (https://www.radixdlt.com/post/how-radix-smart-accounts-work-and-what-they-can-do)" system for assets. This component-based architecture allows Smart Accounts to utilize native Radix features like configurable authentication rules and badge-based authorization. According to Russell Harvey, CTO of RDX Works (/ecosystem/rdx-works), Smart Accounts are "very specialized (https://youtu.be/Opr2nw4AUMU)" components that function similarly to smart contracts. A key advantage of Smart Accounts is the ability to eliminate reliance on a single seed phrase (https://www.radixdlt.com/post/how-radix-smart-accounts-work-and-what-they-can-do) for account recovery. Instead, Radix leverages a system based on "roles" and multi-factor authentication (https://youtu.be/Opr2nw4AUMU) to facilitate recovery and access control.
Smart Accounts also allow users to configure rules around receiving token deposits (https://www.radixdlt.com/post/how-radix-smart-accounts-work-and-what-they-can-do) from third parties. Smart Accounts aim to provide the recovery flexibility of traditional applications (https://youtu.be/Opr2nw4AUMU) with the self-sovereignty of decentralization. This unique design is intended to improve user experience (https://www.radixdlt.com/post/how-radix-smart-accounts-work-and-what-they-can-do) and enable additional account features (https://youtu.be/Opr2nw4AUMU) down the road, such as subscriptions and direct debits. Features Eliminate Seed Phrases A major innovation of Radix Smart Accounts is the ability to eliminate reliance on a single seed phrase for account control and recovery. Instead, Smart Accounts utilize Radix's native badge-based authentication system. Badges are assets on the Radix ledger that can be used for access control and authorization (https://www.radixdlt.com/blog/how-radix-multi-factor-smart-accounts-work-and-what-they-can-do) . The holder of a badge can generate a temporary proof of ownership that Smart Account logic can verify to permit actions. According to the Radix blog (https://www.radixdlt.com/post/how-radix-smart-accounts-work-and-what-they-can-do) : "Because Smart Accounts can specify the "keys" (badge proofs) to its "locks" (authentication rules), the ability to withdraw assets from an account...doesn't have to be bound forever to a single private key or seed phrase." This allows Smart Accounts to natively support multi-factor authentication (https://www.radixdlt.com/blog/how-radix-multi-factor-smart-accounts-work-and-what-they-can-do) using combinations of things like mobile phones, hardware wallets, and personal questions. It also enables social recovery by granting badge tokens to trusted contacts. 
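The "keys (badge proofs) to locks (authentication rules)" idea above can be sketched as follows. This is a hypothetical plain-Rust illustration, not the actual Radix auth system: an access rule lists the badges it accepts and how many distinct factors are required, and presented proofs are checked against it:

```rust
use std::collections::HashSet;

/// Hypothetical illustration, not the actual Radix auth system: an access
/// rule names the badges it accepts and how many distinct factors are
/// required.
struct AccessRule {
    accepted_badges: HashSet<String>,
    threshold: usize, // e.g. 2 for "any two factors"
}

/// A set of presented badge proofs satisfies the rule if at least
/// `threshold` DISTINCT accepted badges are among them.
fn authorized(rule: &AccessRule, presented_proofs: &[String]) -> bool {
    let distinct_matches: HashSet<&String> = presented_proofs
        .iter()
        .filter(|proof| rule.accepted_badges.contains(*proof))
        .collect();
    distinct_matches.len() >= rule.threshold
}

fn main() {
    let rule = AccessRule {
        accepted_badges: ["phone", "hardware-wallet", "security-question"]
            .iter()
            .map(|s| s.to_string())
            .collect(),
        threshold: 2,
    };
    let proofs = vec!["phone".to_string(), "hardware-wallet".to_string()];
    println!("authorized: {}", authorized(&rule, &proofs));
}
```

Because the rule is data rather than a hard-coded key, it can be updated (e.g. during recovery) without the account changing address, which is the essence of escaping the single seed phrase.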
"you can still set it up so that your primary role generally is your phone...but the recovery and these other two roles allow the smart account to kind of manage that for you." - Russell Harvey, ‘Smart Accounts: No more seed phrases (https://youtu.be/Opr2nw4AUMU) ’. Configurable Deposit Settings Another unique capability of Radix Smart Accounts is the ability to configure rules around receiving token deposits from third parties (https://www.radixdlt.com/post/how-radix-smart-accounts-work-and-what-they-can-do) : "You have a variety of options for setting the third-party deposit rules on each of your accounts." Specific deposit settings exposed through the Radix Wallet include: - Default rules to accept all deposits, no deposits, or only previously held assets - Allow lists and deny lists for specific tokens - Whitelisting badge holders to deposit freely "One feature is locking...This is an added piece of Access Controller functionality to help address the problem of 'my phone got stolen and I can't immediately migrate to a new one'." - Russell Harvey, ‘Smart Accounts: No more seed phrases (https://youtu.be/Opr2nw4AUMU) ’. The Access Controller role system allows accounts to be put into a restricted safe mode until security settings can be updated. Additionally, Radix Smart Accounts enable fee payment delegation, allowing the user to choose which account pays fees on a per-transaction basis. Accounts can also easily produce on-ledger proofs of any assets they hold. Miscellaneous In addition to the configurable authentication, authorization, and deposit rule capabilities covered already, Radix Smart Accounts have several other utility features: - Account locking (https://youtu.be/Opr2nw4AUMU) : A security mode that restricts token withdrawals, enabled via the Access Controller recovery role. This safeguards funds if a factor like a phone is lost or stolen while migrating to a new one. 
- Fee payment delegation (https://youtu.be/Opr2nw4AUMU) : Smart Account owners can choose on a per-transaction basis which account will pay the network fees. This is not possible on networks without native fee payment abstractions. - Proof production (https://youtu.be/Opr2nw4AUMU) : Assets held in a Smart Account can have their ownership easily proven on-ledger. This allows frictionless integration with dApps that leverage Radix's decentralized identity system. The component model and native network features provide flexibility for additional account features in future protocol updates. Implementation Radix Smart Accounts are made possible due to the unique architecture of the Radix network. Smart Accounts are "a very specialized [component] that definitely is a smart contract." Several native platform capabilities enable the functionality of Smart Accounts: - Asset-oriented transactions (https://youtu.be/Opr2nw4AUMU) : Transactions specify how assets flow between accounts and components, allowing intuitive withdrawal/deposit rules. - Configurable authentication rules (https://youtu.be/Opr2nw4AUMU) : Radix has a native auth system for components to define access control requirements based on badge proofs rather than a single key. - Component model (https://youtu.be/Opr2nw4AUMU) : Components are Radix's version of smart contracts that can hold real on-ledger assets, unlike accounts on other networks. "Because assets are so fundamental to the Radix platform, the accounts that hold them have to be just as fundamental." - Russell Harvey, ‘Smart Accounts: No more seed phrases (https://youtu.be/Opr2nw4AUMU) ’. Comparison with Ethereum A key differentiation between Radix Smart Accounts and previous account systems is the avoidance of complexity from tacked-on abstractions. Ethereum's proposal for "account abstraction" exemplifies the challenge of bolting advanced features onto an existing protocol not designed for them. 
Ethereum’s ERC-4337 specification aims to " enable the possibility of decentralized multi-factor account control and recovery (https://www.radixdlt.com/post/comparing-account-abstraction-and-radix-smart-accounts) – no magical single seed phrase required.” However, due to the requirement of not disrupting Ethereum's core transaction logic, account abstraction involves: - A separate user operation pool and bundle marketplace - Special "bundler" node infrastructure - Additional reputation scoring mechanisms - Custom factory and entry point smart contracts This introduces significant complexity (https://www.radixdlt.com/post/comparing-account-abstraction-and-radix-smart-accounts) to achieve account flexibility, risking fragility, inefficiency, and compatibility issues. In contrast, the Radix network was built from the ground up (https://youtu.be/Opr2nw4AUMU) with native building blocks for asset manipulation and configurable access control. Transactions specify asset flows between components, with authorization rules set in a component's logic. Rather than a complex add-on, Smart Accounts are simply instances of Radix's component model (https://youtu.be/Opr2nw4AUMU) with additional logic, fitting seamlessly into the standard transaction flow. This fundamental architectural divergence highlights the limitations of attempting to stretch existing protocols versus engineering purpose-built designs optimized for usability. The intuitive simplicity of Radix Smart Accounts provides a more sustainable path toward mainstream adoption by eliminating user experience barriers. ## Scrypto (Programming Language) URL: https://radix.wiki/contents/tech/core-protocols/scrypto-programming-language Updated: 2026-02-03 Scrypto (https://github.com/radixdlt/radixdlt-scrypto) [ /’skrɪptoʊ/ ] is an open-source, smart-contract programming language, designed specifically for the development of decentralized applications (dApps) on Radix. 
Overview Scrypto exists as a set of libraries and compiler extensions that add features, syntax and data types to the Rust (https://www.rust-lang.org/) programming language, allowing for an ‘asset-oriented’ style of programming that treats tokens, NFTs and other ‘resources’ as native objects. Features Scrypto introduces several key features designed specifically for Radix development: - Asset-oriented programming model: Unlike account-based models like Ethereum, Scrypto has native support for tokens, NFTs, and other digital assets as first-class primitives. Resources like tokens can be directly stored, transferred, and manipulated. - Finite state machine model for tokens: Token behavior in Scrypto adheres to a predefined finite state machine. This provides predictable rules around things like minting, burning, transferring, and admin capabilities. The model improves security and avoids surprises. - Access control using ‘badges': Instead of checking the message caller's address for authorization like Solidity, Scrypto uses "badges" - tokens or NFTs - to dictate access rules. This enables complex permissions not tied to accounts. - Built-in royalty system: Smart contract developers can configure royalties to be automatically collected when their contracts are used. This incentivizes creation of reusable blueprints. - Strong typing and compile-time checks: Scrypto leverages Rust's strong type system and compile-time checks to improve security and developer experience. Many common bugs are caught during compilation. - Developer tooling: Scrypto includes a command line tool, simulator, testing framework, and other tooling to enable rapid smart contract development. Syntax and Tools Scrypto adopts a syntax similar to the Rust programming language, as it is built on the Rust compiler toolchain. 
Like Rust, Scrypto utilizes: - Statically typed variables - Immutability by default - Explicit variable ownership - Pattern matching - Trait-based generics - Closure expressions However, Scrypto extends Rust by adding: - New primitive data types like tokens, NFTs, badges, etc. - Attributes like #[blueprint] to denote smart contract blueprints - Built-in functions for blockchain interaction - Removal of non-deterministic features like floating point numbers Scrypto code compiles to WebAssembly (WASM) to run on the Radix virtual machine. The key tools available in the Scrypto developer toolkit include: - Scrypto CLI: Command line tool for building, testing, and deploying Scrypto packages. It wraps cargo for blockchain workflows. - Radix Engine Simulator (Resim): Local simulator that emulates the Radix ledger and runtime. Useful for development testing. - Testing Framework: Tools designed for unit testing Scrypto blueprints and components. - Radix Dashboard: GUI for deploying and interacting with Scrypto packages and components. - Wallet SDK: APIs for connecting Scrypto dApps with the Radix Wallet. - ROLA: Off-ledger authentication system allowing server apps to connect with on-ledger identities. The toolkit enables rapid development, testing, and deployment of end-to-end decentralized applications with Scrypto. Blueprints Scrypto introduces a concept called ‘blueprints’ - similar to classes in object oriented programming - to build reusable, modular smart contract components. Blueprints define the structure and logic for components, including: - State variables to hold token supplies, user balances etc. - Functions to instantiate new component instances - Methods to modify state and execute business logic Components are then live instances created from blueprints, like objects instantiated from classes. This separation of blueprints and components enables: - Reusable code: Components created from the same blueprint have shared logic but separate state. 
Similar to how classes define objects. - Upgradability: Components can be instantiated from new blueprint versions to get upgrades. The blueprint system facilitates iterating code. - Composability: Components can easily call other components to form complex decentralized apps, with clear interfaces between the parts. - Developer incentives: Blueprint authors can charge royalties when others utilize their blueprints to build applications. This funds open source development. Scrypto developers are encouraged to build small, modular blueprints that "do one thing well" - following Unix philosophy - rather than massive monolithic contracts. Packages A Scrypto package is a bundle of one or more blueprints that is published to the Radix ledger. Packages have a unique on-ledger package address once deployed. This allows them to be called and reused by other packages and applications. Scrypto code is modularized into packages to promote reusability. Some benefits include: - Code Organization: Logically grouping related blueprints and functionality into a package. - Namespace Control: Packages manage identifier namespaces for state variables, functions etc inside them. - Encapsulation: Packages can decide which blueprint functions/methods to expose publicly for calling. - Versioning: Packages can publish updates while maintaining addresses and backwards compatibility. - Royalties: Package creators can charge usage fees when downstream code calls package functions. - Sharing: Packages enable discovering and importing reusable Scrypto code modules. The Scrypto CLI provides commands for packaging, testing and deploying Scrypto code to the Radix ledger as ready to use packages. There is growing developer momentum around publishing packages covering areas like token contracts, lending, identity, governance and more. Components Components are a core building block in Scrypto for encapsulating logic and holding state. 
They are similar to smart contracts in other blockchain platforms, but have some key differences. Components are defined by creating a Rust struct inside a blueprint module, like this:

#[blueprint]
mod my_component {
    struct MyComponent {
        my_field: Type,
    }
}

The struct defines the fields that will be stored in the component's state when it is instantiated. To instantiate a component from a blueprint, call the .instantiate() method on the struct:

let my_component = MyComponent { my_field: value }.instantiate();

This will create a new instance of the MyComponent struct and return a reference to it. At this point, the component instance exists locally in the transaction. To make it accessible globally, it needs to be "globalized" by chaining .prepare_to_globalize() and .globalize() after instantiation:

MyComponent { my_field: value }
    .instantiate()
    .prepare_to_globalize(OwnerRole::None)
    .globalize();

To call methods on a component, you reference the component instance and call the method:

my_component.some_method();

Components can also hold resources like tokens in vaults and buckets:

struct MyComponent {
    my_vault: Vault,
}

// Put tokens in vault
my_component.my_vault.put(some_bucket);

// Take tokens out
let bucket = my_component.my_vault.take(100);

In summary: - Components encapsulate state and logic - Defined via Rust structs inside blueprints - Instantiated from blueprint structs - Need to be globalized to be publicly accessible - Interact by calling methods and accessing state Comparison to Other Languages Scrypto differs from other smart contract languages like Solidity and Move in several key aspects: - Compared to Solidity, Scrypto code is more intuitive and easier to reason about due to its asset-oriented model. The finite state machine for assets also provides stronger security guarantees around token behavior. - Unlike Move's account-based programming model, Scrypto was built from the ground up for asset-oriented development. Assets are primitive rather than secondary abstractions derived from accounts.
- Scrypto is the first smart contract language designed specifically for predictable and secure asset handling; existing languages were retrofitted for blockchain use cases.
- Strong typing and compile-time checks give Scrypto better security properties out of the box than contract languages with weaker type systems.
- Access control uses badges rather than relying on caller addresses, avoiding issues like the privilege-escalation exploits seen on other platforms.
- Scrypto has native blockchain tooling for development, testing, and deployment that general-purpose languages used for smart contracts lack.
Overall, Scrypto aims to combine the performance and compile-time checkability of Rust with a domain-driven design tailored specifically for decentralized applications and finance.
Resources
Installation instructions, examples, and resources are available on the Radix Developer Resources (/contents/resources/radix-developer-resources) page.
DEVELOPMENT
Paradigm: Imperative (https://en.wikipedia.org/wiki/Imperative_programming) (Procedural (https://en.wikipedia.org/wiki/Procedural_programming), Object-Oriented (https://en.wikipedia.org/wiki/Object-oriented_programming)); supports Functional (https://en.wikipedia.org/wiki/Functional_programming) programming; Structured (https://en.wikipedia.org/wiki/Structured_programming)
Developer(s): RDX Works (/ecosystem/rdx-works)
Initial Release: 2021-12-15 (https://www.radixdlt.com/post/alexandria-scrypto-is-here)
Latest Release: v1.3 (Cuttlefish) (https://github.com/radixdlt/radixdlt-scrypto/releases)
Documentation: docs-babylon.radixdlt.com/main/scrypto/introduction.html (http://docs-babylon.radixdlt.com/main/scrypto/introduction.html)
GitHub Repo: github.com/radixdlt/radixdlt-scrypto (https://github.com/radixdlt/radixdlt-scrypto)
License: Radix License, v1 (http://radixfoundation.org/licenses/license-v1)
Community: t.me/RadixDevelopers (https://t.me/RadixDevelopers)
## Radix Wallet
URL: https://radix.wiki/contents/tech/core-protocols/radix-wallet
Updated: 2026-02-03 The Radix Wallet (https://wallet.radixdlt.com/) is a mobile-first cryptocurrency wallet built by RDX Works (/ecosystem/rdx-works), developed to manage $XRD and other tokens on the Radix network. The wallet acts as a gateway to decentralized finance (DeFi) applications built on Radix, enabling users to securely store assets, connect to dApps, and access a range of Web3 services. It can be downloaded from wallet.radixdlt.com (https://wallet.radixdlt.com).
Overview
The Radix Wallet is intended to serve as the primary gateway for users to access and manage their digital assets and identities on the Radix network. It functions similarly to wallets on other cryptocurrency networks, like MetaMask or Trust Wallet, but includes additional features that leverage Radix network capabilities like smart accounts and transaction manifests. A key focus in the wallet's development was ease of use, asset safety, and availability across devices. This was enabled by innovations like multi-factor security, simplified recovery from lost credentials, seamless connectivity between mobile and desktop environments, and transparent insight into all wallet assets. The result is a crypto wallet designed to bring an uncomplicated, mainstream user experience to interacting with Web3 services.
Features
- Safety and Recovery: The wallet allows for multi-factor security and recovery of accounts (https://youtu.be/iynGh7rPKmM), preventing loss of funds if a user's phone is lost. This is enabled through the use of smart accounts (https://learn.radixdlt.com/article/what-are-smart-accounts) on the Radix ledger.
- Availability Everywhere: Radix Connect technology (https://learn.radixdlt.com/article/what-is-radix-connect) facilitates connectivity between the mobile wallet and desktop browsers for a seamless user experience across devices.
This uses peer-to-peer WebRTC connections (https://youtu.be/iynGh7rPKmM) rather than a centralized server. - Intuitive Design: The wallet development prioritized simplified onboarding (https://youtu.be/iynGh7rPKmM) and ease of use based on extensive user experience testing. This testing led to significant changes in the original account model (https://youtu.be/iynGh7rPKmM) . - Understanding Assets: The native support for assets on the Radix network (https://learn.radixdlt.com/article/what-are-native-assets) means the wallet can directly access configurations and metadata to display transparent insights on owned tokens, NFTs, liquidity pools and other assets. Components Smart Accounts Smart Accounts (/contents/tech/core-protocols/smart-accounts) are components on the Radix ledger where users hold their assets. They are similar to accounts on other crypto networks but provide additional functionality. Each Smart Account has an address that can be used to receive tokens and assets. The Radix Wallet allows users to easily manage multiple Smart Accounts. Smart Accounts contain built-in ‘vaults’ for holding native assets, instead of relying on separate token contracts like other platforms. This allows for direct balance lookups rather than cross-referencing various contracts. Smart Accounts also include programmable logic, such as the ability to configure multi-factor control and recovery. By using "badges" and Access Controllers, robust security schemes can be implemented without users having to secure a single seed phrase. Preferences can also be set in Smart Accounts, like whether to accept deposits of unknown tokens or reject them. Because Smart Accounts are Radix components, they can be created "virtually" without issuing an on-ledger transaction. This allows generating an address and receiving assets quickly and easily. Personas Personas (/contents/tech/core-protocols/personas) provide an easy way for users to login to Radix dApps without using passwords. 
They also facilitate selective sharing of user data with websites. Personas are created in the Radix Wallet. Each one has a name and can have associated data like an email address or shipping info. Users can utilize one main Persona or create multiples suited for different contexts. Behind the scenes, each Persona links to an Identity component on the Radix ledger. The Identity allows cryptographically signing “challenges” to prove ownership without revealing personal data. Upon logging into a site with a Persona, the website can request access to specific data like a mailing address. The wallet will ask the user’s permission before sharing any data. This eliminates websites needing to store user personal information to deliver customized experiences. Users retain control through the wallet to update data or revoke access. Radix Connect Radix Connect (/contents/tech/core-protocols/radix-connect) enables the mobile-centric Radix Wallet to seamlessly interact with dApps on desktop websites. The one-time setup involves installing a Radix Connect browser extension that displays a QR code, which is then scanned by the mobile wallet. This establishes an encrypted peer-to-peer connection between the wallet and browser that uses WebRTC technology, eliminating the need for a centralized server relay. The connection automatically activates when accessing desktop dApps requiring wallet interaction. For mobile browsers, the wallet utilizes deep linking to directly communicate with dApps rather than Radix Connect. Native Assets Native Assets (https://learn.radixdlt.com/article/what-are-native-assets) refer to the built-in support for representing resources like tokens, NFTs, badges, and other assets on the Radix network. Configuring Native Asset behavior does not require complex smart contract code. Instead, attributes are specified in the resource definition for each asset. 
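To make the resource-definition idea concrete, here is a minimal conceptual model in plain Rust of asset behavior being declared as data attributes rather than implemented as smart contract code. All type, field, and function names here are illustrative assumptions for this sketch, not the actual Scrypto API:

```rust
// Conceptual sketch only: native-asset behavior is declared as attributes on
// the resource definition, so a wallet can read it directly without calling
// contract code. Names and fields are illustrative, not the Radix API.

#[derive(Debug, Clone)]
struct ResourceDefinition {
    name: String,
    symbol: String,
    divisibility: u8, // how finely the asset can be subdivided
    mintable: bool,   // may new supply be created after launch?
    recallable: bool, // may the issuer recall tokens from holders?
}

// A wallet-style view: read the declared attributes directly.
fn describe(def: &ResourceDefinition) -> String {
    format!(
        "{} ({}): divisibility {}, mintable: {}, recallable: {}",
        def.name, def.symbol, def.divisibility, def.mintable, def.recallable
    )
}

fn main() {
    let gumball = ResourceDefinition {
        name: "Gumball Token".to_string(),
        symbol: "GUM".to_string(),
        divisibility: 18,
        mintable: true,
        recallable: false,
    };
    println!("{}", describe(&gumball));
}
```

Because the behavior is plain data rather than arbitrary code, any client can render it uniformly, which is the property the wallet relies on when displaying token permissions.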
This allows the Radix Wallet to directly access the data and properties of all Native Assets in a user's account and display relevant insights. For example, NFT metadata can be shown in-wallet rather than needing to visit a third party site. The transparency of asset configurations also facilitates visibility of token permissions, such as whether it permits being recalled by the issuer. Transaction Manifests Transaction Manifests (/contents/tech/core-protocols/transaction-manifests) are a more transparent form of transacting on the Radix network compared to typical transactions. Rather than a hashed message to a smart contract address, Manifests explicitly list asset movements between accounts and dApps that will occur. This allows the Radix Wallet to provide a clear summary of exactly what will happen for a user to review and prevent unexpected outcomes. Transaction Manifests also inherently enable atomic composability between components without special coordination code. Roadmap The Radix team uses a roadmap to guide the continuing development of the wallet, structuring it into three main sections - Milestones, Underway, and Backlog. Milestones represent significant research and development efforts expected to take multiple months. The Underway section highlights features currently in active development with developers working on coding them. Finally, the Backlog contains small and medium scale features that will be pulled into upcoming releases. This combination of longer-term milestone initiatives and incremental feature building aims to progressively enhance the functionality of the wallet while ensuring architectural soundness. The roadmap priorities also balance innovations that improve security, privacy and the user experience. Understanding this structure provides insight into the Radix team's strategy for enabling seamless, secure decentralized experiences for mainstream users via the mobile wallet. 
Multi-Factor Account Control & Recovery
One of the most anticipated wallet features is robust multi-factor authentication for enhanced account security. The Radix public ledger already enables complex multi-signature configurations and access control at the protocol layer. However, streamlining these capabilities for easy use through the wallet UI/UX requires significant design consideration. The multi-factor milestone will allow users to set up and manage additional sign-in requirements per account, such as hardware security keys or biometric factors. They could also configure advanced recovery methods, like social or custodial account recovery. Behind the scenes, this integrates with the underlying component-based architecture of Radix accounts. To translate the technical possibilities into an intuitive user experience, substantial work is needed to handle use cases and edge cases around multi-factor setup, modification, recovery initiation, device switching, and more. As such, this milestone initiative encompasses numerous complex flows that the Radix team aims to simplify as much as possible. The end goal is for multi-factor security to feel natural and unobtrusive during typical wallet interactions.
Mobile-to-Mobile dApp Support
While the Radix Wallet already enables connecting with desktop-based dApps, allowing direct integration between the mobile wallet and mobile dApps unlocks additional capabilities. Users increasingly access decentralized apps on the go, not just from a computer. Under this milestone, the team will focus on seamless linking when a user wants to connect their Radix Wallet to a dApp on their phone. For example, clicking “Connect Wallet” within a DeFi game app could automatically prompt the user to select accounts and confirm authorization back in the Radix Wallet. Behind the scenes, this leverages Android/iOS functionality for deep linking between apps on the same mobile device.
Another motivation for direct mobile-to-mobile connectivity is enabling transactions initiated in a dApp to trigger wallet confirmation in a streamlined manner. This provides a smooth user flow while still ensuring transactions authenticate via the user’s primary wallet rather than granting open-ended permissions to the dApp itself. The Radix team needs to determine the optimal technical mechanisms for secure app-to-wallet communication flows on mobile, but the end result will be convenient experiences for accessing decentralized apps directly from a mobile device without extensive switching or sacrificed user security.
Push Notifications and Requests
Push notifications are a familiar concept for mobile apps, but their implementation for a decentralized wallet requires some additional considerations. Rather than just broadcasting informational alerts, Radix intends push notifications to enable transaction requests between the wallet, dApps, and even other users. This ties into the Radix Connect sub-component for transmitting requests across the broader Radix ecosystem. Users may take wallet actions, like setting up a transaction in a dApp or through a peer interaction, that necessitate confirmation back in their primary wallet. Enabling push requests allows triggering these approvals without constant app switching. For example, an e-commerce dApp could initialize an order transaction that gets securely pushed to the user’s wallet as a request to sign. Or a friend could request funds, with a notification instantly prompting the user to open their wallet and process it. The vision is to support notifications, signatures, and UI flows that smooth the user experience without compromising security fundamentals. As Radix Connect is further built out to transmit transactions, data, and other signed payloads between parties, reimagining push mechanisms to take advantage of these capabilities unlocks entirely new on-the-go blockchain user experiences.
Multiparty Signing
Unlike the single-signer model common in crypto, Radix evolved its transaction manifest architecture to enable multi-signature capabilities at the protocol level. This allows formally specifying signatures from multiple parties as a requirement for a valid transaction. For example, a DeFi governance token could mandate confirmations from 2 out of 3 authorized board members. However, crafting intuitive wallet flows to coordinate the collaborative actions of multiple signers poses daunting UI/UX challenges. Each participant needs to review and ratify their component of a complex collective transaction. This necessitates passing transaction payloads across multiple wallets to accumulate the threshold of approvals. Designing smooth experiences for cooperative multi-party transactions encompasses streamlining activities such as partial signing, amending transaction details, establishing required badges, and notarization. At each phase, careful consideration of user roles, account and permission configuration, security guarantees, contingency scenarios, and more plays a part as well. By tackling these intricacies, Radix lays the groundwork for innovations ranging from social payment splitting with friends to formal business decisions requiring board resolutions. The result aims to maintain exceptional ease of use regardless of the implementation complexity underneath.
Sessions / Streaming Mode
Sessions and streaming introduce the concept of bounded wallet permissions to enhance dApp integration. Rather than blanket authorizations, users should be able to intentionally configure conditional account access tied to a particular live session. For example, signing into a Web3 game could trigger carefully scoped permissions allowing the game to transact with certain tokens or NFTs during gameplay. However, these privileges would expire after the session closes rather than persist indefinitely.
That way users avoid risky over-authorization yet still enjoy convenient experiences. A major goal is enabling one-time permissions configuration to minimize wallet authorization interruptions after starting a game or application. This may leverage planned functionality like smart accounts version two for governing fine-grained component access. The principles of user security and control still apply within a session’s defined parameters. Altogether, sessions and streaming build on Radix’s componentized transaction architecture to balance application requirements with purpose-bound authorization windows. The nuanced approach aspires to eliminate custody arrangements and excessive interruptions while using apps that need account interactions.
NFC Scanning
NFC scanning functionality would extend the wallet’s connectivity to external environments like events or retail. With appropriate mass adoption, NFC readers integrated by third parties could request verification of assets directly from someone’s mobile wallet via proximity communication protocols. For example, gaining entry to a conference could involve scanning your phone to validate tickets rather than showing a QR code. Or a point-of-sale terminal could confirm sufficient funds for a micropayment upon tapping. These demonstrate more embedded, real-world use cases unlocked by NFC capabilities. On a technical level, NFC scanning fits as a new communication interface for the Radix Connect subsystem. It offers another channel for signing or for sharing proofs of ownership or account status. This would require handling communication handshakes, transfer mechanisms, UI notifications, and appropriate permissioning. While further along the roadmap timeline, enabling seamless NFC and external smart-device syncs illustrates the expanding crypto-to-real-world integrations on Radix’s ambitious horizon. As community developers ideate these bridges, the team keeps pace by preparing supportive wallet tooling.
## Radix Connect
URL: https://radix.wiki/contents/tech/core-protocols/radix-connect
Updated: 2026-02-03 Radix Connect is a framework designed to facilitate seamless interactions between mobile wallets and desktop browsers, specifically tailored for decentralized applications (dApps) on the Radix network. Central to this technology are the Radix Connector browser extension and the Radix Wallet (/contents/tech/core-protocols/radix-wallet), a mobile-first tool that allows easy use of DeFi and Web3 dApps in desktop environments.
Radix Connect Functionality
Radix Connect's primary function (https://learn.radixdlt.com/article/what-is-radix-connect) is to establish a secure, end-to-end encrypted, peer-to-peer connection between mobile wallets and desktop browsers. It features a straightforward one-time setup flow, involving a QR code scan by the Radix Wallet to link with the user's desktop browser. This encrypted connection is designed to be robust and efficient regardless of network conditions.
Radix Connector Browser Extension
A pivotal element of the Radix Connect system is the Radix Connector browser extension (https://learn.radixdlt.com/article/what-is-the-radix-connector-browser-extension). This extension serves as a conduit linking the Radix Wallet on iOS or Android devices with dApp websites in desktop browsers. It differs from conventional wallet extensions like MetaMask in that it doesn't store tokens or private keys but focuses on ensuring a secure connection to protect against potential cyber threats.
Overview
- Functionality: The Radix Connector operates in the background, creating a bridge between the Radix Wallet on iOS or Android mobile phones and dApp websites running in a desktop web browser. It does not function as a wallet itself and does not store tokens or private keys.
- Primary Use: The primary purpose of the Radix Connector is to facilitate a secure and seamless connection for transactions and interactions with dApps while maintaining user privacy and security.
Technical Aspects
- Connection Protocol: The connection established by Radix Connect is end-to-end encrypted and peer-to-peer (P2P), meaning there is no need for a centralized server to relay messages. This ensures privacy and reduces the risk of interception or manipulation.
- Installation and Pairing: Users can install the Radix Connector extension for Chrome from the official Radix website. After installation, a QR code is generated, which users scan using their mobile Radix Wallet to establish a secure link between their phone and desktop browser.
- Security Measures: Radix Connect prioritizes security, providing critical information to the Radix Wallet about the connected dApp webpages, which is used to prevent various types of cyber attacks, including "man-in-the-middle" attacks involving fake or misleading websites.
User Experience
- Ease of Use: The extension offers a one-time setup flow for linking the mobile wallet to the desktop browser, simplifying the user experience and enhancing accessibility for interacting with dApps.
- Mobility: Radix Connect allows the Radix Wallet to offer a mobile-first experience, enabling users to manage their digital assets and interact with dApps while on the go.
Radix dApp Toolkit √ Connect Button
At the heart of Radix Connect's user experience is the Radix dApp Toolkit’s √ Connect button (https://www.radixdlt.com/blog/using-radix-dapp-toolkits-connect-button-for-great-wallet-connection-ux). This toolkit, a TypeScript library, provides essential frontend functionality for dApps. It simplifies the user experience by handling tasks like Persona logins, user data caching, and browser session management. The √ Connect button acts as a visual and interactive element on dApp websites, representing the user’s Radix Wallet.
It displays the user's login status, the Persona logged in with, and details of the data shared with the dApp. It also offers the functionality for users to update shared data or log out, creating a consistent and user-friendly interface across different dApps. User Login Management The Radix dApp Toolkit (https://www.radixdlt.com/blog/using-radix-dapp-toolkits-connect-button-for-great-wallet-connection-ux) also assists in managing user login states. It automatically recognizes a user as logged in upon selecting a Persona, reflected in the state of the √ Connect button. For dApps requiring additional verification, the toolkit can be configured accordingly. The "Update Data Sharing" feature further enhances user control, enabling them to refresh their shared data permissions directly from their wallet. ## Personas URL: https://radix.wiki/contents/tech/core-protocols/personas Updated: 2026-02-03 Personas is a digital identity management feature (https://www.radixdlt.com/blog/radix-wallet-personas) introduced by Radix as part of its Q2 2023 network upgrade to Babylon (/contents/tech/releases/radix-mainnet-babylon) . It represents a significant shift in user authentication and identity management in the context of Web3 technologies. Overview Personas were developed to address the limitations of traditional Web2 and emerging Web3 authentication systems. In Web2, users often rely on a centralized system of passwords and logins, leading to security risks and privacy concerns. The advent of Web3 introduced wallet-based logins that, while more secure and trustless, do not offer flexibility in information sharing and are still tied to what users own rather than who they are. Functionality Radix Personas aim to separate a user's online identity from their asset-holding accounts, enhancing privacy and user control. This decentralized system uses a component known as "identity" on the Radix ledger. 
It allows users to access decentralized applications (dApps) based on their identity rather than their assets. This approach contrasts with centralized systems like Passkeys, where control over access is held by a single entity, and traditional Web3 wallets, where all wallet contents are shared upon connection. Process The login process with Radix Personas involves several steps: - Connection to a dApp on Radix using Radix Connect. - The user's Radix Wallet sends a unique cryptographic data piece. - The dApp verifies this data against identity data on the Radix Network. - Upon successful verification, the user logs in without needing a password. User Control and Privacy Radix Personas provide users with control over the information shared with websites, differing from both Web2 and traditional Web3 solutions. Users can have multiple personas and associate various personal data with each, deciding what information to share with each service. Impact The introduction of Personas by Radix is seen as a step towards enhancing user sovereignty, privacy, and control in the digital domain, addressing some of the critical challenges faced in both Web2 and Web3 identity management systems. ## NFTs on Radix URL: https://radix.wiki/contents/tech/core-protocols/nfts-on-radix Updated: 2026-02-03 Non-fungible tokens (NFTs) are digital assets that represent unique items like artwork, collectibles, and in-game items. On Radix, NFTs are core primitives. Overview On smart contract platforms such as Ethereum, NFTs are typically implemented using complex token standards and contracts. In contrast, Radix recognizes NFTs as core primitives. This intuitive architecture simplifies NFT development and ownership, while unlocking advanced functionality through composability with other Radix components. Implementation NFTs on Radix are ‘resource’ objects in Scrypto (/contents/tech/core-protocols/scrypto-programming-language) and natively understood by Radix's asset architecture. 
As such, they benefit from native accounting of token balances and direct account transfers that avoid the need for complex smart contract interactions. Radix NFTs integrate with other Radix components for managing access control, royalties, decentralized governance, and other functionality. Properties NFT collections on Radix are defined in Scrypto as non-fungible resources with the following properties: - Supply - The maximum number of NFTs that can be minted - Authorization - Who is allowed to mint new NFTs - Metadata schema - The structure of metadata attached to each NFT - Transferability - Whether NFTs can be transferred between accounts Once published to the network, NFTs can be minted by calling the resource's minting function. Each minted NFT receives a unique ID, which combined with the resource address provides a globally unique identifier. Finally, newly minted NFTs are transferred to the recipient's account. The network natively tracks ownership balances of non-fungible resources. No custom token standards or separate smart contract logic is needed. Ownership can be programmatically restricted using mechanisms like soulbound NFTs. Managing NFT Ownership A key advantage of Radix for NFTs is how token ownership is natively tracked and managed at the protocol level. On Ethereum, NFT ownership exists only within smart contracts, which maintain their own separate ledger of who owns each NFT. To transfer an NFT, you must call the smart contract's functions. On Radix, NFT ownership is directly recorded in user accounts, just like token balances. No separate contract is required. This account-based ownership enables simple and secure NFT transfers. To send an NFT, you make an account-to-account transaction just like transferring XRD. Ownership is updated instantly. NFT transfers can also be restricted. For example, by making an NFT resource "non-transferrable", it becomes soulbound to the original recipient's account. 
This is useful for NFT tickets, certificates, or identity badges. Built-in ownership tracking removes friction and vulnerabilities compared to smart contract approaches. All NFT balances are visible on-ledger, and it is easy to prove ownership or authorize use via transaction attachments. Advanced access control features like transfer hooks and reclaimable NFTs can also be implemented in Scrypto. Advanced NFT Features While Radix provides core NFT functionality, Scrypto enables developers to build advanced features: - NFTs can integrate with on-chain access-control badges that confer access to on-chain rewards or real world events. - Minting rules and royalty structures for NFT collections can be governed on-chain via proposals and voting by DAO members. - NFTs can interoperate with other Radix components such as a payment splitter for royalties or a lending platform for NFT collateralization. - NFT gaming items can be interoperable between metaverse environments because items are native to the base layer. Use Cases The flexibility of Radix makes it well-suited for diverse NFT use cases, from art to identity management. 
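The protocol-level ownership tracking and the soulbound restriction described above can be modeled with a small sketch in plain Rust. This is an illustration under assumed names (NftCollection, mint, transfer), not the Radix implementation:

```rust
use std::collections::HashMap;

// Conceptual sketch only: models protocol-level NFT ownership tracking and
// the "soulbound" (non-transferable) restriction. All names are illustrative
// assumptions for this sketch, not the Radix API.
struct NftCollection {
    resource_address: String,     // unique per collection once published
    transferable: bool,           // false => soulbound to the first owner
    next_id: u64,
    owners: HashMap<u64, String>, // NFT local id -> owning account
}

impl NftCollection {
    // Minting assigns the next local id and records the owner directly,
    // the way the ledger tracks balances natively.
    fn mint(&mut self, to: &str) -> String {
        let id = self.next_id;
        self.next_id += 1;
        self.owners.insert(id, to.to_string());
        // Resource address + local id = a globally unique identifier.
        format!("{}:{}", self.resource_address, id)
    }

    // Transfers are plain ownership updates, gated by the declared
    // transferability attribute rather than by contract code.
    fn transfer(&mut self, id: u64, from: &str, to: &str) -> Result<(), &'static str> {
        if !self.transferable {
            return Err("soulbound: transfers are disabled for this resource");
        }
        let owned = self.owners.get(&id).map(|o| o.as_str() == from).unwrap_or(false);
        if !owned {
            return Err("sender does not own this NFT");
        }
        self.owners.insert(id, to.to_string());
        Ok(())
    }
}

fn main() {
    let mut badges = NftCollection {
        resource_address: "resource_badge_example".to_string(),
        transferable: false, // a soulbound collection
        next_id: 0,
        owners: HashMap::new(),
    };
    let global_id = badges.mint("account_alice_example");
    println!("minted {}", global_id);
    // Soulbound: any transfer attempt fails.
    println!("{:?}", badges.transfer(0, "account_alice_example", "account_bob_example"));
}
```

In the real system this bookkeeping lives in the ledger's account vaults rather than in application code, which is why no per-collection smart contract is needed.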
Digital Art
- Representing collectible artwork or music as verifiable digital assets
- Managing artwork ownership and provenance on-ledger
- Integrating with marketplaces, social platforms, and metaverse environments
- Automating royalties to artists on secondary sales via payment splitters
Identity and Access Management
- Soulbound NFTs as unforgeable credentials or badges
- Managing permissions to access apps, venues, and gated content
- KYC and reputation systems based on NFT credentials
- Interoperability with Radix Execute for dynamic permissions
Ticketing
- Tokenizing event tickets and tracking resales
- Putting collectible memorabilia directly into fans' hands
- Coding specialized access, like seating zones, into NFT tickets
- Time-expiring tickets for conferences or transportation
Gaming
- In-game items and assets as user-owned NFTs rather than centralized game inventory
- Composability with DeFi for true in-game economies
- Metaverse interoperability of game items across worlds
- Player identities and profiles as soulbound NFTs
## $XRD Token
URL: https://radix.wiki/contents/tech/core-protocols/xrd-token
Updated: 2026-02-03 $XRD is the native cryptocurrency and utility token of the Radix network. It exists as a monetary asset, with its primary utility being to pay transaction fees, secure the network via staking (/contents/tech/core-concepts/delegated-proof-of-stake-dpos), and participate in the ecosystem’s various governance mechanisms.
Etymology
The naming of $XRD follows the international standards ISO 4217 (https://www.iso.org/iso-4217-currency-codes.html) and ISO 3166 (https://www.iso.org/iso-3166-country-codes.html): ISO 4217 prescribes that codes for commodities and for currencies not backed by a nation begin with X, while ISO 3166 guarantees that no national country code begins with X.
Utility $XRD has two primary purposes within the Radix ecosystem: Staking $XRD is a key component of Radix's Delegated Proof of Stake (DPoS) (/contents/tech/core-concepts/delegated-proof-of-stake-dpos) system, which utilizes $XRD to secure the network against potential attacks like Sybil attacks. In the DPoS system, $XRD holders can "stake" or delegate their tokens to validator nodes they wish to support. The top 100 validator nodes with the highest amount of staked $XRD are selected to participate (https://learn.radixdlt.com/article/what-is-the-xrd-token) in ordering transactions and achieving consensus. Stakers are incentivized to participate by earning a portion of the newly minted $XRD from network emission rewards (https://learn.radixdlt.com/article/what-is-the-xrd-token) . The upcoming Xi’an (/contents/tech/releases/radix-mainnet-xian) network upgrade will remove the limit of 100 validators. Transaction Fees All transactions on the Radix network require a fee paid in $XRD (https://learn.radixdlt.com/article/how-do-transaction-fees-work-on-radix) . The transaction fees are intended to prevent spam transactions by imposing a small economic cost. Notably, 50% of the base network fee portion of each transaction is permanently burned (https://learn.radixdlt.com/article/how-do-transaction-fees-work-on-radix) and removed from circulation. This deflationary pressure is aimed at controlling $XRD supply over time. Transaction fees on Radix are calculated based on the following components (https://learn.radixdlt.com/article/how-do-transaction-fees-work-on-radix) : - A fee table that assigns "CostUnits" to each operation in the transaction based on computation complexity and storage requirements. - A fiat value multiplier that converts the CostUnits to an $XRD fee amount based on the $XRD/USD market price. - Any optional "tip" multipliers specified by the user to increase priority during high demand. 
- Any royalty fees owed to developers for use of their deployed code/components.

The total fee is the sum of the network's assessed CostUnits portion converted to $XRD, plus royalties, multiplied by any tip multipliers. Fees can be paid from any wallet/vault containing $XRD (https://learn.radixdlt.com/article/how-do-transaction-fees-work-on-radix), not just the transaction signer's wallet. A transaction's fees can also be covered by multiple parties "chipping in" portions.

Fee Distribution

Of the base network fee portion (excluding royalties), 50% is permanently burned (https://learn.radixdlt.com/article/how-do-transaction-fees-work-on-radix), 25% goes to the validator leader for that round, and 25% is distributed among all validators at epoch boundaries based on participation. Any royalty fees (https://learn.radixdlt.com/article/how-do-transaction-fees-work-on-radix) are paid out to the developer's specified vault. Tip fees go directly to the validator leader for that round.

Keeping Fees Stable

To keep fees relatively stable in fiat terms, the network can adjust the fiat value multiplier (https://learn.radixdlt.com/article/what-is-the-radix-networks-policy-for-network-fee-and-network-emissions-updates) used for fee calculations whenever the $XRD/USD price deviates significantly, according to a published policy.

Other Uses

As the base currency of Radix, $XRD can potentially be used in various ways within the Radix DeFi (/contents/tech/core-concepts/decentralized-finance-defi) ecosystem, such as:
- Collateral for lending/borrowing protocols such as Weft (/ecosystem/weft-finance).
- A trading pair facilitating exchange with other tokens on exchanges such as Ociswap (/ecosystem/ociswap).
- Payment (https://learn.radixdlt.com/article/what-is-the-xrd-token) currency for dApp services and usage fees.

Supply and Allocation

$XRD has a maximum supply cap (https://learn.radixdlt.com/article/how-was-the-xrd-token-allocated) of 24 billion tokens.
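The fee arithmetic and distribution rules described under Transaction Fees above can be sketched as follows. This is a minimal illustration, not protocol code: the function names and sample figures are hypothetical, and it assumes the tip multiplier applies only to the network fee portion rather than to royalties.

```python
# Illustrative sketch of the fee model described above. Function names and
# sample figures are hypothetical; the real fee table and multiplier are
# defined by the protocol. We assume the tip applies only to the network
# fee portion, not to developer royalties.

def total_fee_xrd(cost_units: int, fiat_multiplier: float,
                  tip_pct: float = 0.0, royalties_xrd: float = 0.0) -> float:
    """Convert CostUnits to $XRD, apply an optional tip, add royalties."""
    network_fee = cost_units * fiat_multiplier   # CostUnits -> $XRD
    network_fee *= 1.0 + tip_pct / 100.0         # optional priority tip
    return network_fee + royalties_xrd           # plus developer royalties

def distribute_base_fee(base_fee_xrd: float) -> dict:
    """Split the base network fee: 50% burned, 25% round leader, 25% all validators."""
    return {
        "burned": base_fee_xrd * 0.50,
        "round_leader": base_fee_xrd * 0.25,
        "all_validators": base_fee_xrd * 0.25,
    }

fee = total_fee_xrd(cost_units=5_000_000, fiat_multiplier=1e-7,
                    tip_pct=10.0, royalties_xrd=0.2)
print(round(fee, 4))             # 0.75
print(distribute_base_fee(0.5))
```

Note that royalties are excluded from the 50/25/25 split, matching the "base network fee portion (excluding royalties)" wording above.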
This maximum supply is divided into two portions:

Initial Allocation at Genesis (12bn $XRD)

At the genesis launch of the Radix network in July 2021, 12bn $XRD (50% of maximum supply) was issued and allocated as follows (https://learn.radixdlt.com/article/how-was-the-xrd-token-allocated):
- 642m $XRD (2.7%) to participants of the October 2020 public token sale
- 3bn $XRD (12.5%) to early members of the Radix community
- 200m $XRD (0.8%) to liquidity mining program participants
- 2.4bn $XRD (10%) to RDX Works Ltd (/ecosystem/rdx-works), the company developing Radix
- 2.158bn $XRD (9%) to Radix Tokens (Jersey) Limited (/ecosystem/radix-foundation), a subsidiary managing the Radix ecosystem
- 600m $XRD (2.5%) reserved for developer incentives
- 600m $XRD (2.5%) reserved as network subsidy
- 2.4bn $XRD (10%) locked as a stable coin reserve

Network Emission over 40 Years (12bn $XRD)

The other 12bn $XRD (50% of maximum supply) is being gradually released (https://learn.radixdlt.com/article/what-is-the-xrd-token) through new mint emissions at a rate of approximately 300m $XRD per year over approximately 40 years. This network emission serves as an incentive to validators and stakers securing the Radix network via the DPoS mechanism. The current circulating supply consists of the 9.6 billion unlocked $XRD from the initial allocation, plus the network emissions released to date. The 2.4 billion stable coin reserve remains indefinitely locked.

Distribution

| Allocation | $XRD | Percentage / Exc. Network Emissions |
|---|---|---|
| Network Emissions | 12,000,000,000 | 50% / — |
| Early Investors | 3,000,000,000 | 12.5% / 25% |
| Stable Coin Reserve | 2,400,000,000 | 10% / 20% |
| Team | 2,400,000,000 | 10% / 20% |
| Radix Foundation | 2,148,000,000 | 8.95% / 17.9% |
| Public Sale | 642,000,000 | 2.675% / 5.35% |
| Developer Incentives | 600,000,000 | 2.5% / 5% |
| Network Subsidy | 600,000,000 | 2.5% / 5% |
| Liquidity Incentives | 200,000,000 | 0.8% / 1.6% |
| Marketing | 10,000,000 | 0.04% / 0.08% |
| TOTAL | 24,000,000,000 | 100% / 100% |

Price-Based Token Unlock

Main article: 2021 Token Unlock

Prior to September 15, 2021, a portion of the $XRD and $eXRD token supplies allocated to the Radix community, company founders, and Radix Foundation was subject to a price-based unlocking mechanism (https://learn.radixdlt.com/article/what-was-radixs-price-based-unlock-mechanism). Under this model, only a baseline percentage of these allocated tokens began as unlocked and tradable. The remaining portion then became unlocked incrementally, with 5% more unlocking for every $0.02 increase in the $XRD/$eXRD price, up to a price of $0.43 at which 100% would be unlocked. This mechanism was put in place to incentivize long-term commitment and retain tokens during Radix's early growth stages. However, after surveying the community, the unlocking thresholds were removed on September 15, 2021 (https://learn.radixdlt.com/article/what-was-radixs-price-based-unlock-mechanism), resulting in 100% of tokens (except the stable coin reserve) becoming immediately unlocked and transferable at that point.

Stable Coin Reserve

As part of the initial $XRD allocation at the Radix network genesis in July 2021, 2.4bn $XRD (10% of the maximum supply) was set aside in a "Stable Coin Reserve" (https://learn.radixdlt.com/article/how-was-the-xrd-token-allocated) and remains indefinitely locked.
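The price-based unlock schedule described above can be sketched as a simple step function. The baseline percentage and starting price were not specified in this article, so they appear below as hypothetical parameters; prices are handled in integer cents to avoid floating-point surprises.

```python
# Hedged sketch of the price-based unlock described above: 5% of the
# allocation unlocks per $0.02 price increase, reaching 100% at $0.43.
# 'baseline_pct' and 'start_cents' are hypothetical parameters that the
# article does not specify.

def unlocked_pct(price_cents: int, baseline_pct: float, start_cents: int) -> float:
    """Percentage of the allocation unlocked at a given price (in cents)."""
    if price_cents >= 43:                               # $0.43 or above: fully unlocked
        return 100.0
    steps = max(0, (price_cents - start_cents) // 2)    # one step per $0.02 increase
    return min(100.0, baseline_pct + 5.0 * steps)

print(unlocked_pct(25, baseline_pct=10.0, start_cents=5))  # 60.0
print(unlocked_pct(43, baseline_pct=10.0, start_cents=5))  # 100.0
```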
Purpose

The Stable Coin Reserve was created to potentially help bootstrap one or more decentralized stablecoin projects (https://learn.radixdlt.com/article/what-is-the-radix-stable-coin-reserve) native to the Radix ecosystem. Reliable and low-volatility stablecoins are considered an important component for decentralized finance (DeFi) applications, providing a stable store of value. While decentralized stablecoins provide benefits, their peg to external assets like the US dollar can also introduce systemic risks. The Stable Coin Reserve aims to mitigate these risks by provisioning a locked capital pool (https://learn.radixdlt.com/article/what-is-the-radix-stable-coin-reserve) that could be utilized if needed.

Policy

The Radix Foundation has a 10-year period (https://learn.radixdlt.com/article/what-is-the-radix-stable-coin-reserve) starting from the July 2021 network launch to review and potentially disburse the Stable Coin Reserve to support underlying protocols for Radix-based stablecoins. However, use of the reserve is not guaranteed: if it has not been utilized by the end of the 10-year period, the reserve will be burned. The 2.4 billion $XRD in the reserve is not included in circulating $XRD supply metrics (https://learn.radixdlt.com/article/what-is-the-xrd-token) during this period.

Wrapped $XRD ($eXRD)

$eXRD is a wrapped representation (https://learn.radixdlt.com/article/what-is-the-exrd-token) of the $XRD token that was deployed on the Ethereum network (https://learn.radixdlt.com/article/what-is-the-exrd-token) as an ERC-20 token.

Purpose

The primary purpose of $eXRD is to facilitate the use of $XRD (https://learn.radixdlt.com/article/what-is-the-exrd-token) within Ethereum's vibrant DeFi ecosystem of decentralized exchanges, lending/borrowing platforms, yield farming opportunities, and more. By having $eXRD represent $XRD on Ethereum, DeFi users can earn yields, provide liquidity, or use it as collateral,
all while benefiting from the features of the high-throughput Radix network. $eXRD was also initially created and distributed prior to $XRD staking going live, as a means to further decentralize the distribution of $XRD tokens (https://learn.radixdlt.com/article/what-is-the-exrd-token) . Supply The circulating supply of $eXRD is not fixed (https://learn.radixdlt.com/article/what-is-the-exrd-token) , as it fluctuates based on how many $XRD tokens are wrapped into $eXRD form or unwrapped back into native $XRD. However, the maximum potential supply of $eXRD can never exceed the total supply of $XRD at any given time since each $eXRD is backed 1:1 by an $XRD token locked in reserve custody. Policy for Fee/Emission Updates The Radix network has established policies to adjust transaction fee rates and validator node emissions in response to changes in the $XRD/USD price and variations in epoch durations. Fee Update Policy The network fee table utilizes a " fiat value multiplier (https://learn.radixdlt.com/article/how-do-transaction-fees-work-on-radix) " that converts operational "CostUnits" to an $XRD fee amount based on the current $XRD/USD market price. Radix Publishing monitors the 7-day simple moving average of the $XRD/USD price. If this deviates by a factor of 2 (0.5x or 2x) from the current multiplier price, they target publishing a network update (https://learn.radixdlt.com/article/what-is-the-radix-networks-policy-for-network-fee-and-network-emissions-updates) with a new fee table multiplier within 2 weeks, based on the new 7-day average price. Emission Update Policy The rate of new $XRD emissions awarded to validators and stakers is targeted at 300 million $XRD per year (https://learn.radixdlt.com/article/what-is-the-xrd-token) . However, the actual emission rate fluctuates based on the duration of epochs on the network. Radix Publishing observes the 30-day moving average of epoch durations. 
If this deviates by a factor of 1.2 (0.83x or 1.2x) from the current duration assumption, they target publishing a network update (https://learn.radixdlt.com/article/what-is-the-radix-networks-policy-for-network-fee-and-network-emissions-updates) within 2 weeks to adjust the per-epoch emission rate to bring it back in line with the 300 million $XRD per year target based on the new 30-day average epoch duration. The updates aim to keep fees reasonably stable (https://learn.radixdlt.com/article/what-is-the-radix-networks-policy-for-network-fee-and-network-emissions-updates) in fiat terms and emissions consistent with the planned annual schedule, without attempting to compensate for past over/under deviations. ## Unspent Transaction Output (UTXO) Model URL: https://radix.wiki/contents/tech/core-concepts/unspent-transaction-output-utxo-model Updated: 2026-02-03 The Unspent Transaction Output (UTXO) model is a way to track ownership of digital assets in cryptocurrency systems. The Radix Engine (/contents/tech/core-protocols/radix-engine) implements a novel UTXO architecture that enables unique features around scalability, asset universality, transaction concurrency, and accounts, while preserving the core integrity of UTXO-based accounting. Classic UTXO Model The unspent transaction output (UTXO) model was popularized by Bitcoin as a way to track ownership of coins in a distributed ledger. In this model, the ledger consists of a set of UTXOs representing coins that can be spent. Each UTXO has a value and a locking script that defines the conditions required to spend it. When a transaction occurs on the Bitcoin network, it consumes UTXOs by satisfying their locking script conditions and creates new UTXOs with updated owners and values. 
Example: If Alice has a 2 BTC UTXO, she can create a transaction consuming that UTXO and creating two outputs:
➡️ a 1.5 BTC UTXO assigned to Bob
➡️ a 0.5 BTC change UTXO back to herself

The key rules of the classical UTXO model are:
- Coins are represented exclusively as UTXOs on the ledger. The set of available spendable coins equals the set of unspent UTXOs.
- UTXOs are created via mining rewards and transaction outputs. New UTXOs come into existence when transactions produce change.
- UTXOs are consumed and destroyed when used as transaction inputs. This transfers ownership by assigning new UTXOs to new owners.
- Transactions specify input UTXOs to consume and output UTXOs to create in a single atomic operation.
- Consensus rules validate transactions by verifying ownership of input UTXOs via satisfaction of their locking scripts.
- The ledger is represented as the chain of transactions, with each transaction changing UTXO state.

This model provides a complete historical record showing the full chain of custody for every coin on the network. The UTXO architecture offers useful integrity properties around ownership and prevents double spending without a central authority.

Radix Engine UTXO Model

The Radix Engine implements a UTXO model that makes several optimizations tailored for scalability and performance of decentralized applications. Some key properties of the Radix Engine UTXO architecture:
- Assets are defined as universal resources rather than being tied to specific smart contracts or accounts. Resources can be freely used across components and shards.
- Components group all their owned resources into a single UTXO. This allows maximum throughput for each component on its shard.
- Accounts are components, so each user's balance is represented by an account-specific UTXO.
- Transactions specify intent rather than directly specifying input/output UTXOs. The protocol dynamically selects UTXOs to satisfy intent.
- Validity relies on digital signatures proving ownership of resources rather than locking script conditions. - Transferring ownership is done by constructing/destructing resources rather than a locking model. This provides inherent atomicity. - Concurrent double spends are resolved by aborting one transaction rather than rejecting both. - The ledger tracks transactions that transform state by constructing/destructing resources. With these properties, the Radix Engine UTXO model offers several key benefits: - Assets are universal across the ledger, enhancing composability between dApps. - Component grouping into single UTXOs maximizes shard throughput. - Specifying intent avoids transaction conflicts, increasing concurrency. - Validity via signatures has low coordination overhead. - Construction/destruction enables atomicity without locks. - Selective transaction abortion increases throughput. ## Trust Boundary URL: https://radix.wiki/contents/tech/core-concepts/trust-boundary Updated: 2026-02-03 A Trust Boundary [ /trʌst ˈbaʊndəri/ ] is a concept used in distributed systems to describe the point beyond which a component must rely on the behavior and integrity of another. Trust boundaries are important because they define the limits of a system's security and control. Overview In the context of blockchains and distributed ledgers, trust boundaries are the reason why assets cannot be transferred between different chains and instead require workarounds such as burn and mint bridges (https://docs.chainswap.com/mechanism/chainswap-architecture) . Radix has its own trust boundary and is designed to obviate the need for asset swaps or sidechains by being linearly scalable. Trust Boundaries in Distributed Systems In the context of distributed systems, a trust boundary can take the form of a physical boundary, such as a network firewall or user authentication system. 
It could also be a logical boundary, such as an interface between two separate software components or a boundary between different levels of privilege within a system. Regardless of its form, a trust boundary serves as a dividing line between different parts of a distributed system. On one side of the boundary, the system can be trusted to behave as expected and to enforce security controls. On the other side, the system's behavior is less certain and may be subject to malicious or unintended actions. One of the key challenges of managing trust boundaries in distributed systems is ensuring that they are properly enforced. This can be difficult because distributed systems are often composed of many different components that may be developed and maintained by different teams or organizations. As a result, it can be difficult to ensure that all components are properly configured and secure, and that they all adhere to the same security standards. To address this challenge, many organizations rely on strict security policies and procedures, as well as security tools and technologies, to enforce trust boundaries and protect their distributed systems. This can include things like encryption, authentication, and access controls, as well as regular security assessments and vulnerability testing. Overall, trust boundaries are an important concept in the field of distributed systems. They help to define the limits of a system's security and control, and determine which parts of a system can be trusted to behave as expected. By properly enforcing trust boundaries, organizations can help to protect their distributed systems from security threats and ensure their reliability and integrity. ## Transactions Per Second (TPS) URL: https://radix.wiki/contents/tech/core-concepts/transactions-per-second-tps Updated: 2026-02-03 Transactions per second (TPS) is a key metric of the speed and scalability of payment and distributed ledger (DLT) networks. 
High TPS enables networks to support demanding real-world transactional loads without congestion.

Overview

Transactions per second (TPS) refers to the number of transactions a network can process each second. It is the most common measurement used to benchmark the performance and scalability of payment and DLT networks. However, TPS is not the only way to measure transaction throughput. Other metrics include:
- Transactions per minute (TPM) - Used when transactions are complex and take longer than a second to process.
- Peak throughput - The maximum TPS achieved under ideal conditions. Gives insight into potential capacity.
- Sustained throughput - The average TPS maintained over time under real-world conditions. Measures actual performance.
- Latency - The time for a transaction to be processed and confirmed. Lower latency allows higher TPS.
- Concurrency - The number of transactions processed in parallel. Higher concurrency can increase TPS.
- Scalability - How throughput changes with increased load. Linear scalability means doubling nodes doubles TPS.

TPS can be measured by benchmarks and tests that simulate transaction loads. Common benchmarks for blockchains involve transferring assets between accounts or executing smart contract functions. The transaction sizes, complexity, and submission rates are parameterized to evaluate different scenarios. When comparing TPS across networks, it is important to understand the assumptions and conditions of each test. Results can vary significantly based on factors like network size, geography, hardware, latency, consensus mechanism, transaction types, and concurrency. As with any benchmark, real-world performance may differ.

TPS Limits of Early Blockchains

Early blockchain networks like Bitcoin and Ethereum had limited transaction throughput due to their consensus mechanisms and network configurations. Bitcoin uses a proof-of-work consensus based on mining, which allows a maximum of around 7 TPS.
This is far below the volume of transactions handled by mainstream payments networks. Ethereum improved on Bitcoin's performance through shorter block times, achieving around 15 TPS. However, as usage increased, the Ethereum network became congested and transaction fees spiked. These limitations of early blockchains stem from tradeoffs that favored decentralization and security over scalability. However, poor TPS presents challenges for adoption, especially for decentralized finance applications with high transaction volumes. Radix TPS Radix is a layer 1 decentralized network designed to achieve extremely high transaction speeds. It utilizes a sharded consensus protocol called Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) to enable linearly scalable transaction throughput. Tests of Radix under various conditions have achieved over 1.4 million TPS (https://www.radixdlt.com/blog/replaying-bitcoin) , significantly faster than other blockchain networks like Ethereum or Solana. Radix aims to support DeFi applications with transaction loads comparable to mainstream payment processors like Visa or PayPal. ## Staking URL: https://radix.wiki/contents/tech/core-concepts/staking Updated: 2026-02-03 Staking is a crucial mechanism in many cryptocurrency (https://www.coinbase.com/learn/crypto-basics/what-is-staking) systems, particularly those using Proof of Stake (PoS) or Delegated Proof of Stake (DPoS) consensus protocols. It involves participants locking up their cryptocurrency tokens (https://www.coinbase.com/learn/crypto-basics/what-is-staking) to support the operation and security of a blockchain network. In return for their contribution, stakers typically earn rewards in the form of additional tokens. Overview The concept of staking serves multiple purposes within a blockchain ecosystem. 
Primarily, it acts as a Sybil protection mechanism (https://learn.radixdlt.com/article/what-kind-of-staking-system-does-radix-use) , ensuring the network's security by making it economically unfeasible for malicious actors to gain control. Additionally, staking encourages token holders to actively participate in network governance and operation. One notable implementation of staking is found in the Radix network (https://learn.radixdlt.com/article/start-here-radix-staking-introduction) , which utilizes a Delegated Proof of Stake (DPoS) system. Radix's approach incorporates unique features such as native liquid staking (https://learn.radixdlt.com/article/what-is-a-liquid-stake-unit-lsu-and-native-liquid-staking) and Liquid Stake Units (LSUs) (/contents/tech/core-concepts/liquid-stake-units) , allowing for greater flexibility and utility of staked assets within its ecosystem. As cryptocurrencies and blockchain technologies continue to evolve, staking has become an increasingly important aspect of many networks, offering a more energy-efficient alternative to Proof of Work systems while providing incentives for network participation and security. How Staking Works Staking is a fundamental process in many cryptocurrency systems, particularly those utilizing Proof of Stake (PoS) or Delegated Proof of Stake (DPoS) consensus mechanisms. Understanding how staking works requires familiarity with these underlying concepts: General Principles of Staking At its core, staking involves participants "locking up" their cryptocurrency tokens (https://www.coinbase.com/learn/crypto-basics/what-is-staking) to support the operation of a blockchain. This process serves multiple purposes: - Network Security: By requiring participants to have a financial stake in the network, staking discourages malicious behavior (https://learn.radixdlt.com/article/what-kind-of-staking-system-does-radix-use) and attacks. 
- Consensus Participation: Staked tokens often grant the right to participate in the network's consensus process, validating transactions and creating new blocks. - Reward Generation: Stakers earn rewards (https://learn.radixdlt.com/article/how-do-token-holders-earn-and-claim-xrd-emissions-rewards) , typically in the form of additional tokens, for their contribution to the network's operation and security. Proof of Stake (PoS) Consensus Mechanism Proof of Stake is an alternative to the energy-intensive Proof of Work (PoW) system used by cryptocurrencies like Bitcoin. In a PoS system: - Validators are chosen to create new blocks based on the amount of cryptocurrency they have staked. - The likelihood of being selected as a validator is generally proportional to the amount staked. - If validators behave dishonestly, they risk losing their staked tokens (https://www.forbes.com/advisor/in/investing/cryptocurrency/what-is-staking-in-crypto/) , providing a strong incentive for proper behavior. Delegated Proof of Stake (DPoS) DPoS is a variation of PoS that aims to increase efficiency and scalability. Key features of DPoS include: - Token holders vote for a limited number of validator nodes (https://learn.radixdlt.com/article/how-are-validators-on-radix-selected) to represent them in the consensus process. - Validators are responsible for creating blocks and maintaining network security. - Rewards are typically shared between validators and their delegators (https://learn.radixdlt.com/article/how-xrd-staking-emissions-rewards-are-calculated-for-token-holders) . For example, the Radix network utilizes a DPoS system (https://learn.radixdlt.com/article/what-kind-of-staking-system-does-radix-use) where token holders can delegate their stake to validator nodes. The network then selects the top 100 validators by delegated stake (https://learn.radixdlt.com/article/how-are-validators-on-radix-selected) to participate in consensus. 
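The selection rule described above, in which the top 100 validators by delegated stake form the active set, can be sketched as follows. Validator names and stake figures are hypothetical.

```python
# Minimal sketch of DPoS validator selection as described above: the top N
# validators by total delegated stake form the active consensus set.
# Names and stake figures are hypothetical.

def select_validators(delegated_stake: dict, n: int = 100) -> list:
    """Return the n validator addresses with the highest delegated stake."""
    ranked = sorted(delegated_stake, key=delegated_stake.get, reverse=True)
    return ranked[:n]

stakes = {"validator_a": 9_000_000, "validator_b": 12_500_000,
          "validator_c": 4_200_000, "validator_d": 7_800_000}
print(select_validators(stakes, n=3))  # ['validator_b', 'validator_a', 'validator_d']
```

In a real network, delegations change every epoch, so the active set is recomputed at epoch boundaries rather than once.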
In both PoS and DPoS systems, the staking process typically involves: - Choosing a validator or staking pool - Locking up tokens for a specified period - Participating in network consensus (directly or via delegation) - Earning rewards based on network participation and performance It's important to note that while staking offers potential rewards, it also comes with risks such as lockup periods and potential loss of rewards due to validator underperformance (https://www.forbes.com/advisor/in/investing/cryptocurrency/what-is-staking-in-crypto/) . Therefore, careful consideration and research are crucial when participating in staking activities. Purpose and Benefits of Staking Staking serves several crucial purposes in blockchain networks and offers various benefits to both the network and its participants. The key aspects are: Network Security One of the primary purposes of staking is to enhance network security: - Sybil Resistance: Staking acts as a Sybil protection mechanism (https://learn.radixdlt.com/article/what-kind-of-staking-system-does-radix-use) , making it economically unfeasible for malicious actors to gain control of the network. By requiring participants to have a significant financial stake, the cost of attacking the network becomes prohibitively high. - Aligned Incentives: Stakers are incentivized to act in the network's best interests (https://www.coinbase.com/learn/crypto-basics/what-is-staking) because their staked assets are at risk if they behave maliciously. This alignment of incentives contributes to the overall security and stability of the blockchain. - Decentralization: In systems like Radix's Delegated Proof of Stake (DPoS) (https://learn.radixdlt.com/article/how-should-i-choose-validators-to-stake-to) , staking encourages a diverse set of validators, promoting decentralization and reducing the risk of centralized control. 
Consensus Participation Staking is integral to the consensus process in Proof of Stake (PoS) and Delegated Proof of Stake (DPoS) systems: - Validator Selection: In DPoS systems like Radix, staking determines which nodes become validators (https://learn.radixdlt.com/article/how-are-validators-on-radix-selected) . Token holders stake their assets to vote for validators, ensuring that the most trusted and capable nodes participate in consensus. - Block Production: Validators selected through staking are responsible for creating new blocks and validating transactions, maintaining the blockchain's integrity and progress (https://www.coinbase.com/learn/crypto-basics/what-is-staking) . - Efficient Consensus: Staking-based consensus mechanisms are generally more energy-efficient than Proof of Work systems, allowing for faster transaction processing and greater scalability. Rewards and Incentives Staking provides significant benefits to participants in the form of rewards: - Passive Income: Stakers earn rewards for their contribution to the network (https://learn.radixdlt.com/article/how-do-token-holders-earn-and-claim-xrd-emissions-rewards) , typically in the form of additional tokens. This creates an opportunity for passive income generation. - Network Emissions: Many networks, including Radix, distribute newly created tokens (emissions) to stakers (https://learn.radixdlt.com/article/how-do-xrd-emissions-work) as a reward for their participation. - Compounding Returns: In systems like Radix, staking rewards are often automatically restaked (https://learn.radixdlt.com/article/will-my-staking-emissions-rewards-be-automatically-staked) , allowing for compound growth of staked assets over time. - Validator Fees: In some networks, validators can set fees (https://learn.radixdlt.com/article/what-are-validator-fees) , providing an additional incentive for node operators to maintain high-quality infrastructure. 
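The reward mechanics listed above (emissions shared with delegators, validator fees, automatic restaking) can be combined into a small compounding sketch. The per-epoch reward model and all rates are hypothetical simplifications, not actual network parameters.

```python
# Hedged sketch of auto-compounding staking rewards: each epoch the staker
# receives a share of emissions (minus the validator's fee), which is
# automatically restaked. Rates and epoch counts are hypothetical.

def compound_stake(stake: float, reward_rate_per_epoch: float,
                   validator_fee_pct: float, epochs: int) -> float:
    for _ in range(epochs):
        reward = stake * reward_rate_per_epoch       # epoch emission share
        reward *= 1.0 - validator_fee_pct / 100.0    # validator keeps its fee
        stake += reward                              # reward is restaked
    return stake

# 10,000 tokens at a hypothetical 0.0001% per epoch, 2% validator fee,
# over roughly one year of five-minute epochs:
final = compound_stake(10_000, 0.000001, 2.0, 105_120)
print(round(final, 2))
```

Because rewards are restaked each epoch, the balance grows geometrically rather than linearly, which is the "compounding returns" effect described above.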
Additional Benefits - Governance Participation: In many systems, staking grants voting rights, allowing participants to have a say in the network's governance and future development. - Network Support: By staking, participants actively contribute to the network's operation and growth, supporting the blockchain ecosystem they believe in. - Liquid Staking: Some networks, like Radix, offer native liquid staking (https://learn.radixdlt.com/article/what-is-a-liquid-stake-unit-lsu-and-native-liquid-staking) , allowing stakers to maintain liquidity of their staked assets through tokenized representations (e.g., Liquid Stake Units or LSUs). While staking offers numerous benefits, it's important to note that it also comes with considerations such as lockup periods and the potential for losses due to validator underperformance (https://www.forbes.com/advisor/in/investing/cryptocurrency/what-is-staking-in-crypto/) . Participants should carefully research and understand these aspects before engaging in staking activities. Staking on Different Blockchain Networks While the core concept of staking remains similar across various blockchain networks, the implementation details, rewards, and processes can differ significantly. Here's an overview of staking on some major networks, with comparisons to Radix where applicable: Ethereum Ethereum, one of the largest blockchain networks, transitioned from Proof of Work to Proof of Stake (https://www.coinbase.com/learn/crypto-basics/what-is-staking) with the Ethereum 2.0 upgrade: - Minimum Stake: Ethereum requires a minimum of 32 ETH to become a validator. - Staking Pools: Due to the high minimum stake, many users participate through staking pools. - Liquid Staking: Unlike Radix's native liquid staking (https://learn.radixdlt.com/article/what-is-a-liquid-stake-unit-lsu-and-native-liquid-staking) , Ethereum relies on third-party solutions like Lido for liquid staking. 
Cardano Cardano uses a Proof of Stake consensus mechanism called Ouroboros: - Delegation: Similar to Radix's DPoS system (https://learn.radixdlt.com/article/what-kind-of-staking-system-does-radix-use) , Cardano allows users to delegate their stake to stake pools. - No Minimum Stake: Unlike Ethereum, Cardano doesn't have a minimum staking requirement. - Epoch System: Cardano uses an epoch system for rewards distribution, similar to Radix's epoch-based emissions (https://learn.radixdlt.com/article/how-do-xrd-emissions-work) . Solana Solana is known for its high-speed transactions and uses a novel consensus mechanism called Proof of History in conjunction with Proof of Stake: - Validator Requirements: Solana has high hardware requirements for validators, potentially limiting decentralization compared to Radix's more inclusive validator system (https://learn.radixdlt.com/article/how-do-i-run-a-validator-on-radix) . - Delegation: Users can delegate their stake to validators, similar to Radix's DPoS system. - Short Epochs: Solana's epochs are shorter than many other networks, allowing for more frequent reward distributions. Tezos Tezos uses a Liquid Proof of Stake system: - Delegation: Users can delegate their tokens to "bakers" (validators), similar to Radix's validator delegation (https://learn.radixdlt.com/article/how-are-validators-on-radix-selected) . - Liquid Democracy: Tezos allows delegators to switch between bakers easily, promoting a more dynamic staking ecosystem. Cosmos Cosmos uses a Tendermint consensus mechanism, which is a form of Proof of Stake: - Validator Set: Like Radix's top 100 validators (https://learn.radixdlt.com/article/why-arent-validator-sets-larger-on-radixs-babylon-mainnet) , Cosmos has a limited set of active validators. - Delegation: Users can delegate their tokens to validators. 
- Slashing: Cosmos implements slashing for validator misbehavior, which is more severe than Radix's emissions-based penalties (https://learn.radixdlt.com/article/what-happens-if-a-validator-node-misses-many-consensus-rounds) . Unique Features of Radix Staking While many networks offer staking, Radix has several unique features: - Native Liquid Staking (https://learn.radixdlt.com/article/what-is-a-liquid-stake-unit-lsu-and-native-liquid-staking) : Radix issues Liquid Stake Units (LSUs) directly through the protocol, unlike many networks that rely on third-party solutions for liquid staking. - Validator Components (https://learn.radixdlt.com/article/what-is-a-validator-component) : Radix uses special system components for staking, providing additional security and flexibility. - Emission-based Penalties (https://learn.radixdlt.com/article/how-xrd-staking-emissions-rewards-penalties-are-calculated-general) : Instead of slashing stake, Radix penalizes underperforming validators by withholding emissions. - Automatic Compounding (https://learn.radixdlt.com/article/will-my-staking-emissions-rewards-be-automatically-staked) : Staking rewards on Radix are automatically restaked, maximizing returns for participants. While each network has its unique approach to staking, they all share the common goal of incentivizing network participation and security. The choice between networks often depends on factors such as ease of use, reward rates, and specific features that align with a user's needs and preferences. Staking on Radix Radix Staking Overview Radix utilizes a Delegated Proof of Stake (DPoS) system (https://learn.radixdlt.com/article/what-kind-of-staking-system-does-radix-use) for its consensus mechanism. This system allows XRD token holders to participate in network security and earn rewards by delegating their tokens to validator nodes. 
Key features of Radix staking include: - Native Liquid Staking: Radix implements native liquid staking (https://learn.radixdlt.com/article/what-is-a-liquid-stake-unit-lsu-and-native-liquid-staking) , allowing stakers to maintain liquidity of their staked assets. - Validator Selection: The community of XRD token holders effectively chooses the set of validators (https://learn.radixdlt.com/article/start-here-radix-staking-introduction) that operate the Radix Public Network through their staking choices. - Emissions-Based Rewards: Stakers earn rewards through network emissions (https://learn.radixdlt.com/article/how-do-xrd-emissions-work) , with approximately 300 million XRD emitted per year. Radix Validator Components Radix uses special validator components (https://learn.radixdlt.com/article/what-is-a-validator-component) for staking: - System Components: These are smart contract components native to the Radix Protocol that cannot be modified by anyone. - Stake Management: When a node-runner registers as a validator, a validator component is automatically created to manage staking for that node. - Security: Validator components ensure that staked XRD cannot be stolen or misused by node operators (https://learn.radixdlt.com/article/what-happens-to-my-tokens-when-i-stake-them-could-the-node-i-delegate-to-steal-them) . Liquid Stake Units (LSUs) Liquid Stake Units (LSUs) (/contents/tech/core-concepts/liquid-stake-units) are a unique feature of Radix staking: - Representation: LSUs represent a proportional claim on the pool of XRD tokens staked with a particular validator. - Transferability: LSUs are fully liquid and can be exchanged for other tokens in Radix's DeFi ecosystem. - Validator-Specific: Each validator has its own unique LSU token (https://learn.radixdlt.com/article/why-are-liquid-stake-units-lsus-unique-to-each-validator) , reflecting the performance and fees of that specific validator. 
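The proportional-claim mechanics of LSUs described above can be sketched in a few lines. This is an illustrative model only, not Radix's actual implementation; the function name and pool figures are hypothetical:

```python
# Illustrative sketch (not Radix's implementation): an LSU is a proportional
# claim on a validator's stake pool, so the XRD redeemable for a given LSU
# amount is simply the holder's share of the pool's XRD.

def lsu_redemption_value(lsu_amount: float,
                         total_lsu_supply: float,
                         pool_xrd: float) -> float:
    """XRD claimable for `lsu_amount` LSUs of a single validator's pool."""
    if total_lsu_supply <= 0:
        raise ValueError("pool has no LSUs outstanding")
    return lsu_amount * pool_xrd / total_lsu_supply

# A pool holding 1,050,000 XRD (initial stake plus accrued emissions)
# against 1,000,000 outstanding LSUs: each LSU redeems for 1.05 XRD.
print(lsu_redemption_value(100.0, 1_000_000, 1_050_000))  # 105.0
```

Under this model, emissions accrue to the pool while the LSU supply stays fixed, so the per-LSU redemption value tends to rise over time; this is consistent with rewards compounding without new LSUs being minted.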
Staking Process on Radix The staking process on Radix involves: - Staking: Users stake XRD through the Radix Dashboard (https://learn.radixdlt.com/article/how-does-staking-work-on-radix) , receiving LSUs in return. - Earning Rewards: Stakers earn rewards in the form of additional XRD (https://learn.radixdlt.com/article/how-do-token-holders-earn-and-claim-xrd-emissions-rewards) , which are automatically added to their stake. - Unstaking: To unstake, users return their LSUs (https://learn.radixdlt.com/article/how-does-staking-work-on-radix) and wait for a cooldown period of 2,016 epochs (approximately 7 days) before claiming their XRD. Validator Selection Choosing validators is a critical aspect of Radix staking: - Importance: Validator selection directly affects network security and performance (https://learn.radixdlt.com/article/how-should-i-choose-validators-to-stake-to) . - Criteria: Users are encouraged to consider factors such as validator performance, reliability, and contribution to network decentralization. - Tools: The Radix Dashboard provides information to help users make informed staking decisions (https://learn.radixdlt.com/article/what-tools-are-available-to-token-holders-to-support-them-in-deciding-which-validator-nodes-to-stake-to) . Rewards and Emissions Radix uses an emissions-based reward system: - Emission Rate: Approximately 300 million XRD are emitted annually (https://learn.radixdlt.com/article/how-do-xrd-emissions-work) as staking rewards. - Distribution: Rewards are distributed at the end of each epoch (https://learn.radixdlt.com/article/when-are-xrd-emissions-rewards-distributed) (approximately every 5 minutes). - Validator Fees: Validators can set a fee percentage (https://learn.radixdlt.com/article/what-are-validator-fees) of the rewards they earn. - Compounding: Rewards are automatically restaked (https://learn.radixdlt.com/article/will-my-staking-emissions-rewards-be-automatically-staked) , allowing for compound growth. 
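Using the figures quoted above (roughly 300 million XRD emitted per year, epochs of roughly 5 minutes), a back-of-envelope sketch of per-epoch emissions and a delegator's post-fee share might look like the following. The function and its parameters are illustrative simplifications, not the protocol's actual distribution rule:

```python
# Back-of-envelope sketch using figures quoted in the article: ~300 million
# XRD emitted per year across epochs of roughly 5 minutes. The real
# distribution also depends on stake share, uptime, and proposal
# participation; this is a simplification, not the protocol rule.

EPOCHS_PER_YEAR = 365 * 24 * 60 // 5   # ~105,120 five-minute epochs
ANNUAL_EMISSION_XRD = 300_000_000

def delegator_epoch_reward(validator_emission: float,
                           fee_pct: float,
                           delegator_stake: float,
                           validator_total_stake: float) -> float:
    """A delegator's share of one epoch's emission after the validator fee."""
    after_fee = validator_emission * (1 - fee_pct / 100)
    return after_fee * delegator_stake / validator_total_stake

# Network-wide emission per epoch under these assumptions:
print(round(ANNUAL_EMISSION_XRD / EPOCHS_PER_YEAR, 1))  # 2853.9

# A delegator holding 10% of a validator's stake, with a 10% validator fee:
print(delegator_epoch_reward(1000.0, 10.0, 100.0, 1000.0))  # 90.0
```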
Radix Validator Set The Radix network maintains a specific validator set: - Top 100 Validators: Only the top 100 validators by delegated stake participate in consensus (https://learn.radixdlt.com/article/how-are-validators-on-radix-selected) . - Dynamic Selection: The validator set is recalculated at the start of each epoch (https://learn.radixdlt.com/article/how-are-validators-on-radix-selected) , allowing for rotation based on stake changes. - Performance Incentives: Validators that miss consensus rounds lose emissions for that epoch (https://learn.radixdlt.com/article/what-happens-if-a-validator-node-misses-many-consensus-rounds) , incentivizing reliable performance. Radix's staking system is designed to promote network security, decentralization, and active participation from token holders. Through its unique features like native liquid staking and validator components, Radix aims to provide a flexible and efficient staking experience for its users. Risks and Considerations While staking offers numerous benefits, it also comes with certain risks and considerations that participants should be aware of: Lockup Periods One of the primary considerations in staking is the lockup or "vesting" period: - Liquidity Constraints: Many staking systems require tokens to be locked for a certain period (https://www.coinbase.com/learn/crypto-basics/what-is-staking) , during which they cannot be transferred or sold. - Unstaking Delay: On Radix, for example, there is a cooldown period of 2,016 epochs (approximately 7 days) when unstaking (https://learn.radixdlt.com/article/how-long-does-it-take-to-convert-lsus-to-xrd-and-unstake) . This delay applies even when using Liquid Stake Units (LSUs). - Opportunity Cost: During lockup periods, stakers may miss out on other investment opportunities or the ability to sell during price fluctuations. 
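As a quick sanity check on the cooldown figures quoted above, the arithmetic can be verified directly (the 5-minute epoch length is approximate):

```python
# Sanity-checking the unstaking figures quoted in the article: a cooldown
# of 2,016 epochs at roughly 5 minutes per epoch works out to about 7 days.

COOLDOWN_EPOCHS = 2016
MINUTES_PER_EPOCH = 5            # approximate epoch length

cooldown_minutes = COOLDOWN_EPOCHS * MINUTES_PER_EPOCH
cooldown_days = cooldown_minutes / (60 * 24)
print(cooldown_days)  # 7.0
```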
Validator Performance Impact The performance of chosen validators can significantly affect staking returns: - Missed Rewards: If a validator misses consensus rounds or underperforms, it can lead to reduced or lost rewards for its delegators (https://learn.radixdlt.com/article/what-happens-if-a-validator-node-misses-many-consensus-rounds) . - Penalties: In some networks, validator misbehavior can result in penalties. While Radix doesn't slash stakes, underperforming validators lose emissions for the affected epoch (https://learn.radixdlt.com/article/how-xrd-staking-emissions-rewards-penalties-are-calculated-general) , impacting delegator rewards. - Validator Selection Importance: This underscores the importance of carefully selecting validators (https://learn.radixdlt.com/article/how-should-i-choose-validators-to-stake-to) based on their reliability and performance history. Market Volatility Staking doesn't insulate participants from the inherent volatility of cryptocurrency markets: - Price Fluctuations: The value of staked assets can change dramatically due to market conditions (https://www.forbes.com/advisor/in/investing/cryptocurrency/what-is-staking-in-crypto/) , potentially offsetting or exceeding staking rewards. - Opportunity Cost During Downturns: In a declining market, stakers may be unable to sell their assets quickly due to lockup periods. - Reward Value Fluctuations: As staking rewards are typically in the form of additional tokens, their value in fiat terms can vary with market conditions. Slashing (on Applicable Networks) While not applicable to Radix, many Proof of Stake networks implement slashing: - Stake Reduction: Validators that violate network rules may have a portion of their stake (and by extension, their delegators' stake) destroyed. - Financial Loss: This can result in direct financial losses for stakers, beyond just missed rewards. 
- Network-Specific Risks: The severity and conditions for slashing vary between networks, requiring stakers to understand the specific risks of each platform they engage with. Centralization Risks Staking systems can potentially lead to centralization concerns: - Stake Concentration: There's a risk of stake becoming concentrated among a few large validators (https://learn.radixdlt.com/article/how-should-i-choose-validators-to-stake-to) , potentially compromising network decentralization. - Whale Influence: Large token holders ("whales") can have disproportionate influence over the validator set and, by extension, the network. Technical Complexity Staking often involves technical processes that may be challenging for some users: - User Error: Mistakes in the staking process, such as sending tokens to the wrong address, can result in loss of funds. - Platform-Specific Knowledge: Each network has its own staking mechanics, requiring users to learn new processes when engaging with different platforms. Regulatory Uncertainty The regulatory landscape for staking remains uncertain in many jurisdictions: - Legal Status: The legal classification of staking rewards (e.g., as income or capital gains) can be unclear and may vary by region. - Future Regulations: Potential future regulations could impact the viability or profitability of staking activities. Smart Contract Risks For networks that rely on smart contracts for staking (unlike Radix's native validator components (https://learn.radixdlt.com/article/what-is-a-validator-component) ), there are additional considerations: - Code Vulnerabilities: Bugs or vulnerabilities in staking smart contracts could potentially lead to loss of funds. - Auditing Importance: This underscores the importance of using well-audited and secure staking platforms. 
While these risks and considerations are important to understand, many can be mitigated through careful research, diversification of staked assets, and staying informed about the specific mechanics and performance of chosen validators and networks. As with any investment activity, participants in staking should conduct thorough due diligence and consider their risk tolerance before engaging. Liquid Staking Liquid staking is an innovative approach in the cryptocurrency space that aims to solve the liquidity issues traditionally associated with staking. The concept has gained significant traction because it allows users to utilize their staked assets while still participating in network security and earning rewards. Definition and Core Concept Liquid staking allows users to stake their cryptocurrency tokens while retaining liquidity (https://learn.radixdlt.com/article/what-is-a-liquid-stake-unit-lsu-and-native-liquid-staking) through the issuance of derivative tokens that represent their staked assets. These derivative tokens can typically be used in various DeFi applications, providing users with additional utility for their staked assets. Benefits of Liquid Staking - Maintained Liquidity: Users can trade or use representations of their staked assets in other DeFi applications, overcoming the traditional lockup constraints of staking. - Dual Yield Potential: By using liquid staking tokens in other DeFi applications, users can potentially earn additional yield on top of their staking rewards. - Increased Capital Efficiency: Liquid staking allows the same assets to simultaneously contribute to network security and participate in other economic activities within the ecosystem. - Lowered Barrier to Entry: Liquid staking enables users to participate in staking with smaller amounts of tokens, as they can easily exit their position by selling the liquid staking tokens if needed.
Radix's Native Liquid Staking Radix implements a unique approach to liquid staking: - Native Protocol Implementation: Unlike many other networks that rely on third-party solutions, Radix provides native liquid staking at the protocol level (https://learn.radixdlt.com/article/what-is-a-liquid-stake-unit-lsu-and-native-liquid-staking) . - Liquid Stake Units (LSUs): When users stake XRD on Radix, they receive Liquid Stake Units (LSUs) in return (https://learn.radixdlt.com/article/how-does-staking-work-on-radix) . These LSUs represent a claim on the underlying staked XRD and any accrued rewards. - Validator-Specific LSUs: Each validator on Radix has its own unique LSU token (https://learn.radixdlt.com/article/why-are-liquid-stake-units-lsus-unique-to-each-validator) . This allows the value of LSUs to reflect the performance and fees of individual validators accurately. - Full Transferability: LSUs are fully liquid and can be exchanged for other tokens in Radix's DeFi ecosystem (https://learn.radixdlt.com/article/what-is-a-liquid-stake-unit-lsu-and-native-liquid-staking) without unstaking the underlying XRD. Comparison with Other Networks While Radix offers native liquid staking, other networks often rely on third-party solutions: - Ethereum: - Uses third-party solutions like Lido for liquid staking. - Lido issues stETH tokens representing staked ETH. - Unlike Radix's validator-specific LSUs, stETH represents a claim on a pool of staked ETH across multiple validators. - Solana: - Third-party protocols like Marinade Finance provide liquid staking. - These protocols issue tokens like mSOL that represent staked SOL. - Polkadot: - Uses nominated proof-of-stake, where users can nominate validators with their tokens. - Liquid staking is provided by third-party protocols that issue derivative tokens. - Cosmos: - Some Cosmos-based chains have implemented native liquid staking. - Others rely on third-party protocols for liquid staking solutions. 
Considerations and Risks While liquid staking offers numerous benefits, it's important to consider potential risks: - Smart Contract Risk: For networks relying on third-party solutions, there's a risk of vulnerabilities in the smart contracts managing the liquid staking tokens. - Peg Maintenance: The value of liquid staking tokens should closely track the value of the underlying staked assets. Deviations from this peg can lead to complications. - Centralization Concerns: Popular liquid staking solutions can lead to a concentration of stake, potentially impacting network decentralization. - Regulatory Uncertainty: The regulatory status of liquid staking tokens is still unclear in many jurisdictions. Radix's native implementation of liquid staking mitigates some of these risks (https://learn.radixdlt.com/article/what-happens-to-my-tokens-when-i-stake-them-could-the-node-i-delegate-to-steal-them) , particularly those related to smart contract vulnerabilities and third-party dependencies. However, users should still be aware of general market risks and the specific mechanics of Radix's liquid staking system. Liquid staking represents a significant innovation in the staking ecosystem, enabling users to maximize the utility of their staked assets. As the concept continues to evolve, it's likely to play an increasingly important role in the broader DeFi landscape. ## Sharding URL: https://radix.wiki/contents/tech/core-concepts/sharding Updated: 2026-02-03 Sharding [ /ˈʃɑːdɪŋ/ ] is a method of partitioning a database (https://www.mongodb.com/docs/manual/sharding/) horizontally (https://towardsdatascience.com/database-terminologies-partitioning-f91683901716) across separate servers to improve scalability, performance and data availability. 
In the context of DLTs, sharding refers to the process of dividing the network's computational and storage workload across multiple smaller groups of nodes (https://www.radixdlt.com/blog/sharding-in-radix) , called shards, each responsible for processing a subset of the network's transactions and storing a portion of the global state. Background The primary goal of sharding in DLTs is to increase the overall throughput and capacity of the network (https://vitalik.eth.limo/general/2021/04/07/sharding.html) without sacrificing decentralization or security. By allowing multiple shards to process transactions in parallel, sharding aims to overcome the limitations of ‘full replication’ where every node must process and store all transactions (https://vitalik.eth.limo/general/2021/04/07/sharding.html) . Sharding has gained significant attention in the blockchain community as a potential solution to the scalability trilemma (/contents/tech/core-concepts/blockchain-trilemma) , which posits that blockchain systems can only achieve two out of three desirable properties: scalability, security, and decentralization. By implementing sharding, projects such as NEAR, MultiversX and Radix aim to maintain high levels of security and decentralization while dramatically improving scalability. The concept of sharding in DLTs extends beyond simply partitioning data; it encompasses complex mechanisms for ensuring data availability, cross-shard communication, and maintaining overall network consistency (https://www.radixdlt.com/blog/breakthrough-in-consensus-theory-scaling-defi-without-breaking-composability) . Technical Fundamentals Sharding is a concept borrowed from traditional database systems and adapted for use in distributed ledger technology. Database Sharding In traditional database systems, sharding is a method for distributing data across multiple machines (https://www.mongodb.com/docs/manual/sharding/) . 
It involves breaking a large database into smaller, more manageable partitions called shards. Each shard contains a subset of the data and is stored on a separate server. This approach allows for horizontal scaling, where additional servers can be added to increase capacity and performance. There are two main types of database sharding: - Vertical Partitioning: Different tables from the same database are stored in different instances. - Horizontal Partitioning: A database table is split into separate sets of rows, stored in different database instances (https://www.radixdlt.com/blog/what-is-sharding) . Application to Blockchain and Distributed Ledgers In the context of blockchain and distributed ledgers, sharding involves dividing the network's computational and storage workload across multiple smaller groups of nodes (https://www.radixdlt.com/blog/sharding-in-radix) , each responsible for processing a subset of the network's transactions and storing a portion of the global state. The key difference in blockchain sharding is that it must maintain the security and decentralization properties of the network while improving scalability. This involves complex mechanisms for ensuring data availability, cross-shard communication, and maintaining overall network consistency. Radix, for example, implements a unique approach called "pre-sharding" (https://www.radixdlt.com/blog/sharding-in-radix) , where the network launches with a maximum number of shards (2^64 or approximately 18.4 quintillion) already in place. This allows for future scalability without needing to change the fundamental structure of the network as it grows. Key Goals: Scalability, Decentralization, Security The primary objectives of implementing sharding in distributed ledgers are: - Scalability: By allowing parallel processing of transactions across multiple shards, the overall throughput of the network can be significantly increased (https://vitalik.eth.limo/general/2021/04/07/sharding.html) . 
In theory, this can lead to "quadratic sharding," where the capacity of the network grows quadratically with the computational power of individual nodes. - Decentralization: Sharding aims to maintain or even improve decentralization by allowing the network to run on consumer-grade hardware (https://vitalik.eth.limo/general/2021/04/07/sharding.html) . This is in contrast to some high-throughput chains that rely on a small number of powerful nodes. - Security: A well-designed sharding system should maintain the security properties of the network (https://vitalik.eth.limo/general/2021/04/07/sharding.html) , ensuring that an attacker cannot easily compromise the system by targeting a small subset of shards. The challenge in implementing sharding lies in achieving all three of these goals simultaneously, effectively addressing the scalability trilemma that has long plagued blockchain networks. Types of Sharding Sharding in distributed ledger technology can be implemented in various ways, each focusing on different aspects of the network's operation. The main types of sharding are network sharding, transaction sharding, and state sharding. These approaches can be used individually or in combination to achieve the desired scalability improvements. Network Sharding Network sharding involves dividing the network nodes into smaller groups, each responsible for a subset of the network's tasks (https://www.radixdlt.com/blog/sharding-in-radix) . In this approach, nodes within a shard communicate more frequently with each other than with nodes in other shards. This can reduce the overall network communication overhead and improve efficiency. Radix, for example, implements a form of network sharding where each node maintains as many shards as it can handle, dropping shards that are too resource-intensive (https://www.radixdlt.com/blog/sharding-in-radix) . This allows even low-powered devices to participate in the network, promoting decentralization. 
Transaction Sharding Transaction sharding involves distributing the processing of transactions across different shards (https://vitalik.eth.limo/general/2021/04/07/sharding.html) . In this approach, each shard is responsible for processing a subset of the total transactions in the network. This allows for parallel processing of transactions, potentially increasing the overall throughput of the network. One challenge with transaction sharding is ensuring that transactions that depend on each other or affect the same state are processed correctly (https://www.radixdlt.com/blog/sharding-in-radix) . Radix addresses this by using a deterministic process to assign transactions to shards based on the public keys involved, ensuring that related transactions are always processed in the same shard. State Sharding State sharding is perhaps the most complex form of sharding. It involves dividing the global state of the network (i.e., the full record of all accounts and their data) across multiple shards (https://vitalik.eth.limo/general/2021/04/07/sharding.html) . Each shard maintains only a portion of the global state, which can significantly reduce the storage and computational requirements for individual nodes. However, state sharding introduces challenges in cross-shard communication and maintaining consistency across the network. Ethereum 2.0's sharding plan (https://vitalik.eth.limo/general/2024/05/23/l2exec.html) , for instance, initially focused on data sharding rather than full state sharding to mitigate some of these challenges. The implementation of state sharding often requires sophisticated techniques such as fraud proofs or zero-knowledge proofs (ZK-SNARKs) to ensure the validity of cross-shard transactions (https://vitalik.eth.limo/general/2021/04/07/sharding.html) and maintain the overall security of the network. 
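The deterministic assignment of transactions to shards described above can be sketched as follows. The hash function and shard-space size are illustrative assumptions, not Radix's actual parameters:

```python
import hashlib

# Illustrative sketch of deterministic shard assignment: hash an account's
# public key and reduce it modulo the shard space, so every transaction
# touching that account lands in the same shard. The hash function and
# shard-space size are assumptions for illustration, not Radix's actual
# parameters.

SHARD_SPACE = 2**64  # hypothetical fixed shard space

def shard_for_key(public_key: bytes, shard_space: int = SHARD_SPACE) -> int:
    digest = hashlib.sha256(public_key).digest()
    return int.from_bytes(digest, "big") % shard_space

# The mapping is deterministic: the same key always yields the same shard,
# so related transactions never need cross-shard double-spend checks.
assert shard_for_key(b"alice-pubkey") == shard_for_key(b"alice-pubkey")
```

Because the mapping is a pure function of the public key, any node can compute an account's shard locally, which is what makes the indexing and lookup advantages possible without a global directory.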
Security Considerations While sharding offers significant scalability benefits for distributed ledger systems, it also introduces new security challenges that must be carefully addressed. This section explores the key security considerations in sharded systems and the solutions proposed to mitigate these risks. Single-Shard Takeover One of the primary security concerns in sharded systems is the potential for an attacker to take over a single shard. In a traditional blockchain, an attacker typically needs to control a majority of the network's resources to carry out an attack. However, in a sharded system, an attacker might only need to control a majority within a single shard to cause damage (https://vitalik.eth.limo/general/2021/04/07/sharding.html) . Random sampling for validator assignment (https://vitalik.eth.limo/general/2021/04/07/sharding.html) has been suggested to prevent targeted attacks on specific shards. Data Availability Ensuring data availability is crucial in sharded systems. If data becomes unavailable, it can lead to stalled chains or even allow for ransom attacks on specific user data (https://vitalik.eth.limo/general/2021/04/07/sharding.html) . This is particularly challenging in a sharded environment where not all nodes have access to all data. Proposed solutions are: - Data availability sampling (https://vitalik.eth.limo/general/2021/04/07/sharding.html) to ensure that all necessary data is available without requiring nodes to download entire blocks. - Erasure coding (https://vitalik.eth.limo/general/2021/04/07/sharding.html) to provide redundancy and facilitate data availability checking. Cross-Shard Transaction Atomicity Maintaining atomicity for transactions that span multiple shards is a significant challenge. 
Ensuring that the parts of a cross-shard transaction either all execute correctly or all fail (https://www.radixdlt.com/blog/breakthrough-in-consensus-theory-scaling-defi-without-breaking-composability) is crucial for preserving the integrity of the system. Different projects have proposed various solutions: - Ethereum's proposed "yanking" mechanism (https://vitalik.eth.limo/general/2021/04/07/sharding.html) allows for objects to be moved between shards. - Radix's Cerberus consensus protocol (https://www.radixdlt.com/blog/breakthrough-in-consensus-theory-scaling-defi-without-breaking-composability) implements a "braiding" technique that allows for atomic composability across shards. These mechanisms aim to ensure that cross-shard transactions are executed correctly and atomically, maintaining the consistency of the overall system. Adaptive Adversaries Sharded systems can be vulnerable to adaptive adversaries who can quickly compromise or shut down specific nodes in real time (https://vitalik.eth.limo/general/2021/04/07/sharding.html) . This poses a particular threat to systems that rely solely on committees for security. History and Development The concept of sharding in distributed ledgers has evolved significantly over time, with various approaches proposed and refined to address the scalability challenges of blockchain networks. This section traces the history and development of sharding from early proposals to modern implementations. Early Proposals (2015-2017) The idea of applying sharding to blockchain systems began to gain traction in the mid-2010s: - In 2015, a team at the National University of Singapore (NUS) proposed an early Byzantine Fault Tolerant (BFT) sharding approach (https://vitalik.eth.limo/general/2021/04/07/sharding.html#what-are-some-moderately-simple-but-only-partial-ways-of-solving-the-scalability-problem) .
- Ethereum's Vitalik Buterin started discussing sharding concepts for Ethereum as early as 2016 (https://vitalik.eth.limo/general/2021/04/07/sharding.html#improving-sharding-with-better-security-models) . - Other early conceptual work included proposals like "puzzle towers" by Dominic Williams (https://vitalik.eth.limo/general/2017/12/31/sharding_faq.html#how-do-you-actually-do-this-sampling-in-proof-of-work-and-in-proof-of-stake) , which aimed to combine proof-of-work with sharding. These early proposals often focused on sharding either transaction processing or state, but not both (https://vitalik.eth.limo/general/2021/04/07/sharding.html#what-are-some-moderately-simple-but-only-partial-ways-of-solving-the-scalability-problem) , which limited their potential scalability gains. Evolution of Approaches (2018-2020) As research in sharding progressed, more comprehensive approaches began to emerge: - Radix's initial consensus protocol, Tempo (/contents/tech/research/tempo-consensus-mechanism) , was designed to be sharded from the start (https://www.radixdlt.com/blog/tempo-consensus-lessons-learned) . It used a novel approach called "logical clocks" for consensus. - Zilliqa, launched in 2019, implemented a form of transaction sharding (https://vitalik.eth.limo/general/2021/04/07/sharding.html#what-are-some-moderately-simple-but-only-partial-ways-of-solving-the-scalability-problem) . - The concept of "quadratic sharding" was developed (https://vitalik.eth.limo/general/2021/04/07/sharding.html#sharding-through-random-sampling) , aiming to scale both processing power and storage capacity. - Researchers began to address key challenges such as cross-shard transactions and data availability (https://vitalik.eth.limo/general/2017/12/31/sharding_faq.html#how-can-we-facilitate-cross-shard-communication) . 
During this period, the focus shifted towards creating more secure sharding models that could maintain decentralization (https://vitalik.eth.limo/general/2021/04/07/sharding.html#improving-sharding-with-better-security-models) . This included the development of techniques like: - Data availability sampling (https://vitalik.eth.limo/general/2021/04/07/sharding.html#scalable-verification-of-data-availability) - Fraud proofs and zero-knowledge proofs for scalable verification (https://vitalik.eth.limo/general/2021/04/07/sharding.html#scalable-verification-of-computation) Modern Implementations (2021-present) Recent years have seen more concrete implementations and refined approaches to sharding: - Ethereum 2.0 (https://vitalik.eth.limo/general/2021/04/07/sharding.html) initially focused on data sharding rather than computation sharding. The original plan involved creating 64 separate shard chains (https://vitalik.eth.limo/general/2021/04/07/sharding.html) , each capable of processing transactions and smart contracts independently. - Other projects like Near Protocol have implemented their own versions of sharding (https://vitalik.eth.limo/general/2021/04/07/sharding.html#recap-how-are-we-ensuring-everything-is-correct-again) . 
Modern sharding approaches often combine multiple techniques to achieve scalability while maintaining security: - Using committees with random sampling for shard management (https://vitalik.eth.limo/general/2021/04/07/sharding.html#sharding-through-random-sampling) - Implementing cross-shard communication protocols (https://vitalik.eth.limo/general/2017/12/31/sharding_faq.html#how-would-synchronous-cross-shard-messages-work) - Leveraging Layer 2 solutions like rollups in conjunction with sharding (https://vitalik.eth.limo/general/2021/04/07/sharding.html#recap-how-are-we-ensuring-everything-is-correct-again) Sharding in Radix Radix has developed an integrated sharding and consensus architecture specifically designed for hyper-scalability of its decentralized network. In Radix’s case, sharding applies to both data availability and transaction execution, as both functions are performed by nodes. Ledger Pre-Sharding The current Radix Mainnet (Babylon) (/contents/tech/releases/radix-mainnet-babylon) is sharded into a fixed number of 2^256 (https://www.talkcrypto.org/blog/2019/04/08/all-you-need-to-know-about-2256/) shards. Responsibility for validating shards is undertaken by groups of validators called shard groups (/contents/tech/core-concepts/shard-groups) , which may grow or shrink dynamically in response to load demand (https://youtu.be/FZWT3j9XHMI) . Currently, the number of shard groups is capped at one, but this cap will be lifted with Radix’s forthcoming Xi’an (/contents/tech/releases/radix-mainnet-xian) release. Pre-sharding is in contrast to the dynamic adaptive state sharding (https://coinmarketcap.com/academy/glossary/adaptive-state-sharding) model adopted by Shardeum, MultiversX, and NEAR, where shards are added incrementally as required. While sharding can improve scalability, an ad hoc approach to sharding leads to substantial difficulties, as any changes to the shard structure require reorganizing the entire network - a time-consuming and expensive process.
The larger the sharded ledger grows, the more problematic this becomes. Ad hoc sharding also complicates queries and data lookups within the ledger. When data is sharded randomly, specific transactions or data points become much harder to locate since they could be stored anywhere. This slows down queries as more extensive searches are required. Deterministic Shard Indexing Shards on Radix are indexed deterministically by public keys. This means that the shard index for any address can be calculated by taking the modulo (https://en.wikipedia.org/wiki/Modulo) of the public key over the shard space: sᵢ = pᵢ mod S, where sᵢ = shard index, pᵢ = public key, and S = total shard space. By deterministically grouping related data into the same shard, Radix avoids the need for expensive data reorganization as the network grows. This creates four major advantages: - Proximity: All transactions from a particular account are guaranteed to be in the same shard, which makes it trivial to identify attempted double-spends. - Asynchrony: Transactions from separate accounts will always involve separate shards, enabling asynchronous, parallel processing of unrelated transactions. - Indexing: Lookup complexity and query time are reduced since shard locations can be easily derived from public keys. - Load balancing: Hash sharding typically results in a more uniform distribution of data across nodes. Network Security A key challenge in sharding distributed ledgers is ensuring sufficient security and node coverage across all shards. If some shards have far fewer nodes than others, it creates vulnerabilities. Radix employs several techniques to maintain security across its sharded network: - Node Identity Shard Mapping: To secure the network, validator node addresses are mapped to a single ‘root’ shard. Nodes must permanently maintain their root shards, but can support additional shards to earn more transaction fees.
Underserved shards offer higher returns, attracting more validators and preventing any shards from being overlooked. This free market approach maintains security even as the network scales. - Incentives for Multi-Shard Validation: Based on factors like computing resources, validators can choose to support additional shards beyond their root shard. The more shards a node supports, the more transaction fees it can earn. This creates an incentive for validators to support as many shards as feasible to maximize profits. In this way, the overall validation workload is distributed across nodes. - Dynamic Shard Support via Free Market: As the network grows, some shards may end up with fewer nodes supporting them compared to other oversubscribed shards. These underserved shards then inherently offer higher potential returns since there is less competition for fees. The higher relative profits attract more validators to begin supporting the underserved shards. This brings coverage back into equilibrium across shards through a free market approach. - Scaling Security Through Staking: In proof-of-stake networks like Radix, staking provides additional security. The more tokens a validator stakes, the more shards it can validate. This allows validation load to scale up securely. High-stake validators may validate transactions across many shards in parallel for efficiency. However, low-stake nodes still play a key role in providing decentralized shard coverage. Together, these mechanisms ensure Radix can securely scale across its vast shard space without running into coverage gaps or centralization issues. The network organically self-regulates to distribute validation across shards. Cerberus Consensus Main article: Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) Radix's Cerberus consensus protocol introduces ‘braided’ sharding to atomically compose (/contents/tech/core-concepts/atomic-composability) transactions across shards.
Cerberus shards transaction validation while braiding validation across shards to enforce system-wide transaction ordering and prevent double-spending. This unique braided architecture ensures that Radix can securely scale transaction throughput across a sharded network of effectively unlimited size. ## Shard Groups URL: https://radix.wiki/contents/tech/core-concepts/shard-groups Updated: 2026-02-03 Shard groups [ /ʃɑrd grups/ ] or Validator Sets on Radix are groups of validators responsible for storing and validating the ledger state on subsets of the Radix shardspace (/contents/tech/core-concepts/sharding) . Unlike the shardspace itself, which is fixed at 2^256 shards, shard groups are dynamic and can adjust their shard coverage according to demand (https://www.radixdlt.com/blog/test-method-part1#:~:text=the%20network%20has%20a%20fixed%20shard%20space%20of%2018.4%20quintillion%20shards%20and%20a%20Node%20can%20operate%20as%20much%20or%20little%20of%20the%20shard%20space%20as%20it%20likes%20(assuming%20it%20has%20enough%20resources)) . Overview Rather than fully replicating state and execution across all validators, shard groups handle validation and consensus for subsets of shards to facilitate scaling while retaining security. Validators within these groups participate in the Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) consensus protocol to validate transactions on their shards. By reaching consensus amongst themselves, they can commit shard state while resisting various failure scenarios. The composition of validators in these sets is determined algorithmically, aiming to balance factors like security, stake distribution, and maximizing decentralization. For example, no single validator should make up more than 33% of the total stake in a set, as this would allow them to potentially compromise consensus. 
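The 33% constraint described above can be illustrated with a simple check. This is a hypothetical sketch (the function name and data shape are invented for illustration; the actual shard group assignment algorithm is not specified here):

```python
# Hypothetical sketch: reject a proposed shard group if any single validator
# controls more than 1/3 of the group's total stake, since that share would
# let it stall or compromise BFT consensus within the group.
def violates_stake_cap(stakes: dict[str, int], cap: float = 1 / 3) -> bool:
    """`stakes` maps validator id -> staked tokens for one proposed shard group."""
    total = sum(stakes.values())
    return total > 0 and any(s / total > cap for s in stakes.values())

# A group where one validator holds half the stake fails the check:
print(violates_stake_cap({"v1": 500, "v2": 300, "v3": 200}))  # True (v1 holds 50%)
print(violates_stake_cap({"v1": 300, "v2": 300, "v3": 300}))  # False
```

A real assignment algorithm would do more than reject such a group; it would rebalance validators across groups until every group passes a check of this kind.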
Organization One of the key security mechanisms in Radix is the periodic shuffling of validators between different shard groups. Shuffling involves removing validators from the group responsible for one shard subset and reassigning them to a different shard group handling another subset. It constantly varies the composition of validators across the different parts of the network. Shuffling serves two main purposes: - Preventing attacks - Shuffling makes it much more difficult for malicious actors to target specific shards over longer timescales. Even if a shard group were compromised at one point, shuffling ensures its members would be dispersed across shards over time. This raises the cost and complexity for attackers. - Enhancing decentralization - Randomly reassigning validators also allows stake and participation to be distributed more evenly across all areas of the network. If validators remained static in sets, stake could concentrate, making certain sets larger targets. The shuffling process creates some overhead, mainly for the validators that get reassigned. These validators need to resynchronize state with the new shard group responsible for those shards, so some redundancy and bootstrapping work is required. However, this is an acceptable tradeoff for the security benefits. Additionally, the quantity of validators shuffled per epoch is limited to keep disruption minimal. For example, only 10-20% might rotate each epoch. This allows the majority of validators to remain in place, so resyncing overhead does not become excessive. The optimal shuffling frequency, quantity of validators to shuffle, and assignment algorithms remain active research questions; the goal is to balance security with performance. Configuration The algorithms and mechanisms for determining shard group composition and assignments represent one of the most complex facets of the Radix network.
Managing 2^256 shards requires strategic placement of validators to maximize decentralization and security. Some key areas of focus include: - Quantity of validators per set - Ideal stake distribution per set - Minimum thresholds for preventing collusion - Shuffling cadence between sets - Automated assignment of validators to balance stake distribution Ideally, shard group configurations guarantee sufficient collective stake within each set to make attacks infeasible. However, maximum decentralization is also crucial to avoid concentration of power. Finding optimal formulations to address these sometimes competing priorities is an iterative, ongoing process as the network grows. Utilizing extensive simulations and tests, the Radix research team continues tuning configurations to balance performance needs with security guarantees. Research Ongoing research (/contents/tech/research) focuses on optimizing configurations for shard groups related to size, quantity, and shuffling cadence, with a view to finding an ideal balance between security, performance, and decentralization as the network evolves. ## Honest Majority Assumption URL: https://radix.wiki/contents/tech/core-concepts/honest-majority-assumption Updated: 2026-02-03 The Honest Majority Assumption is a critical security premise (https://arxiv.org/abs/1809.05613) that underlies Proof-of-Work (PoW) and Proof-of-Stake (PoS) Sybil defense mechanisms. The assumption states that at any given time, a majority of the mining or staking power in the network is controlled by honest participants who follow the protocol rules correctly. Overview In PoW blockchains like Bitcoin (https://bitcoin.org/bitcoin.pdf), the assumption is that honest miners controlling the majority of the network's collective hashing power will outpace any minority of malicious miners trying to revise the blockchain history.
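The "outpace" claim can be made quantitative. The Bitcoin whitepaper cited here models the probability that an attacker controlling a minority fraction q of the hash power ever catches up from z blocks behind as (q/p)^z, where p = 1 - q is the honest share; a short sketch:

```python
# Probability that an attacker with hash-power fraction `q` ever catches up
# from `z` blocks behind the honest chain (Bitcoin whitepaper, section 11).
def catch_up_probability(q: float, z: int) -> float:
    p = 1.0 - q              # honest majority's share of hash power
    if q >= p:               # attacker has >= 50%: success is guaranteed
        return 1.0
    return (q / p) ** z      # gambler's-ruin result for a minority attacker

# With 10% of the hash power, being 6 blocks behind is nearly hopeless:
print(catch_up_probability(0.10, 6))  # ~1.9e-06
```

The probability decays exponentially in z, which is why waiting for several confirmations is considered safe under an honest majority, and why the guarantee collapses entirely once q reaches 50%.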
Similarly, in PoS blockchains, the assumption is that at least two-thirds of the staked cryptocurrency is controlled by honest validators (https://www.researchgate.net/publication/340798012_A_Survey_on_Consensus_Methods_in_Blockchain_for_Resource-constrained_IoT_Networks) who will refuse to validate conflicting blockchain histories. This assumption allows the decentralized consensus mechanism to resist attacks and censorship attempts by any dishonest minority of participants. As long as the assumption holds true, the chain's canonical transaction history remains immutable and indisputable. However, if the assumption is violated and honest participants no longer control a majority of mining/staking power, malicious actors can begin disrupting consensus (https://eprint.iacr.org/2019/752) and the fundamental guarantees provided by the chain. Assumption Details The honest majority assumption is a fundamental security requirement for both Proof-of-Work (PoW) and Proof-of-Stake (PoS) consensus mechanisms used in blockchains: - For PoW, it requires that the majority of the total mining or hashing power is controlled by honest miners (https://eprint.iacr.org/2019/752) who follow the protocol rules correctly. If a single malicious entity gains majority control over the mining power, they can disrupt consensus by censoring transactions or rewriting history. - For PoS, the assumption requires that the majority of the total staked cryptocurrency is controlled by honest validators (https://x.com/fuserleer/status/1799465167899492427) . Validators are chosen in proportion to their stake, so if dishonest validators gain majority stake, they can similarly undermine consensus guarantees. Maintaining an honest majority among consensus participants is critical because it prevents adversaries from disrupting the decentralized consensus process (https://bitcoin.org/bitcoin.pdf) . 
This enables key properties like censorship resistance, where no single entity can arbitrarily censor or reverse transactions, and immutability, where the blockchain's transaction history cannot be unilaterally rewritten. Violations of the honest majority undermine these security guarantees (https://blog.ethereum.org/2014/01/15/slasher-a-punitive-proof-of-stake-algorithm) that make blockchains robust and decentralized. ## Finite State Machines URL: https://radix.wiki/contents/tech/core-concepts/finite-state-machines Updated: 2026-02-03 Finite State Machines (FSMs) are software components that restrict an application’s behavior to a subset of possible states. Overview At its core, an FSM processes inputs and transitions between these states based on a set of predefined rules. Each state represents a particular condition or phase of the system, and the FSM can only be in one state at any given time. By offering a structured approach to system behavior, FSMs ensure predictability and reliability, especially important in mission-critical applications. Current Landscape of DeFi and Challenges Ethereum employs a smart contract model that can be visualized as a "black box" with considerable complexity. An Ethereum Smart Contract (SC) acts as a miniature computer server, running specific code on the network. To interact with these SCs, users or other apps invoke their "methods" through signed messages. While this model affords great flexibility, aligning with Ethereum's vision of a "global computer", it also brings substantial challenges: - Internal Variables: These are integral to a SC's function. For instance, an ERC-20 SC uses internal variables to keep track of token balances for every user. - Complex Transactions: Transactions involving multiple SCs add layers of complexity. Each SC might call methods on the others, leading to a series of internal state changes before a final outcome. Such intricate transactions can be difficult to track, making reliability hard to ensure.
- Unpredictable Outcomes: Most DeFi breaches on Ethereum arise from the intricacies associated with SCs. With each SC having the power to modify its internal state freely, unforeseen or unpredictable results can emerge, often to the detriment of users. Mission-Critical Systems Mission-critical systems are those where failure might result in dire consequences, such as loss of life or catastrophic financial repercussions. The very nature of these systems necessitates a robust and foolproof control mechanism. Here's where FSMs come into the picture: - FSMs in Embedded Systems: FSMs are widely used in mission-critical embedded systems where the predictability of outcomes is paramount. FSMs can effectively process inputs to produce expected outputs based on their current state, ensuring consistent and reliable operations. Role of FSMs in dApps Integrating FSMs into VMs could significantly transform the way dApps operate: - Predictable Correctness: FSMs can provide a structured approach to dApp development, ensuring that apps behave predictably and reliably, even in complex scenarios. - Simplified Transaction Logic: With FSMs, the transaction logic can be streamlined, reducing the cascade of internal state changes and making outcomes more understandable. - Enhanced Security: By limiting the possible states a dApp can be in, FSMs can reduce the potential attack vectors and vulnerabilities commonly exploited in conventional smart contract models. Use of FSMs in Radix Radix has strategically incorporated Finite State Machines (FSMs) into its architecture to address specific issues observed in traditional blockchain platforms. The FSM model, by its nature, brings structure and predictability to system interactions. In the context of Radix, this means that dApps have clearer transactional logic.
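As a minimal sketch of the FSM pattern (the class, state names, and event names below are hypothetical illustrations, not Radix Engine APIs), a component can be restricted to a fixed table of legal transitions, so that any other input is rejected instead of mutating state freely:

```python
# Hypothetical FSM sketch: the component may only move along predefined
# transitions; any other input raises an error rather than changing state.
class FiniteStateMachine:
    # (current_state, event) -> next_state; anything not listed is illegal
    TRANSITIONS = {
        ("initiated", "validate"): "validated",
        ("validated", "execute"): "executed",
        ("executed", "complete"): "completed",
    }

    def __init__(self) -> None:
        self.state = "initiated"

    def handle(self, event: str) -> str:
        key = (self.state, event)
        if key not in self.TRANSITIONS:
            raise ValueError(f"illegal transition {event!r} from state {self.state!r}")
        self.state = self.TRANSITIONS[key]
        return self.state

fsm = FiniteStateMachine()
print(fsm.handle("validate"))   # validated
print(fsm.handle("execute"))    # executed
print(fsm.handle("complete"))   # completed

try:
    fsm.handle("validate")      # illegal once the machine is in "completed"
except ValueError as err:
    print(err)
```

Because every reachable state and transition is enumerated up front, the full behavior of the component can be audited exhaustively, which is the property the surrounding text attributes to FSM-based designs.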
For instance, when transferring assets between parties or implementing multi-step financial protocols, the use of an FSM ensures that the dApp proceeds through clearly defined states, reducing the risk of unexpected behaviors. A practical example can be seen in Radix's approach to token exchanges. Instead of allowing free-form interaction as seen in many Ethereum-based protocols, a Radix dApp built using an FSM follows a more regimented and predictable path: from initiation, to validation, to execution, and finally to completion. This structured methodology inherently reduces complexities and potential vulnerabilities, paving the way for safer and more reliable decentralized applications. ## Decentralized Science (DeSci) URL: https://radix.wiki/contents/tech/core-concepts/decentralized-science-desci Updated: 2026-02-03 Decentralized Science (DeSci) refers to an emerging model of organizing and funding scientific research (https://medium.com/molecule-blog/the-emergence-of-biotech-daos-407e31748cd4) in a more open, collaborative, and decentralized manner, powered by Web3 technologies such as blockchain, cryptocurrency tokens, decentralized autonomous organizations (DAOs), and non-fungible tokens (NFTs). Goals The goals and purported benefits (https://future.com/a-guide-to-decentralized-biotech/) of the DeSci model include enabling broader participation in science from different stakeholders like patients and researchers, moving power away from centralized research institutions towards decentralized communities, creating new tokenized funding and incentive models for research, and promoting open-access collaboration. Some have referred to DeSci as aiming to create a "creator economy (https://www.molecule.xyz/blog/ip-nfts-for-researchers-a-new-biomedical-funding-paradigm)" for scientists, in the same way that NFTs have created new economic opportunities for artists.
The DeSci movement has gained significant momentum over the past couple of years, with a growing number of projects and communities (https://youtu.be/5_ZNsiWhXfU) coalescing around building decentralized infrastructure for science spanning areas like funding, data sharing, open access publishing, governance, and laboratory services. Components Decentralized Funding Models A key component of the DeSci model is experimenting with new decentralized ways of allocating funding and incentives for scientific research, moving beyond traditional sources like government grants. Several DeSci projects have utilized DAOs to coordinate funding from a group of members to support early stage research. For example, VitaDAO is a community of over 5000 scientists and enthusiasts funding longevity research, having supported over $2 million in projects (https://youtu.be/5_ZNsiWhXfU) so far. NFTs representing intellectual property licenses have emerged as another DeSci funding method. Molecule's IP-NFT model (https://www.molecule.xyz/blog/ip-nfts-for-researchers-a-new-biomedical-funding-paradigm) allows research assets to be listed and purchased on a marketplace, with the buyers having certain rights to resulting IP, data access, or royalty streams. Some projects are also experimenting with tokenized incentive models to motivate scientific contributions. Participants can earn and stake governance tokens for active involvement in a DeSci community, influencing decision making on what research gets funded. There is also interest in decentralized matching funds through mechanisms like quadratic funding (https://youtu.be/J1hUafXYchg) , which uses a mathematical formula to allocate donations in a way that rewards projects with the most support from small donors. Decentralized Knowledge Sharing Decentralizing access to scientific knowledge is another major focus area of the DeSci movement. 
This includes building open access platforms for publishing research as well as enabling more open data sharing. A number of projects are experimenting with blockchain-based solutions for open access scientific publishing (https://youtu.be/J1hUafXYchg) , including decentralized peer review models that incentivize community participation. These platforms aim to make papers and data more freely accessible outside of expensive journal paywalls. Enabling permissionless access to research data (https://youtu.be/J1hUafXYchg) is another priority. Distributed storage networks like Filecoin and Arweave allow researchers to permanently store datasets and control access rights using crypto wallets rather than centralized authorities. Data NFTs can also give fine-grained control over sharing specific datasets. Molecule has introduced a concept of attaching data access rights to IP-NFT research assets (https://www.molecule.xyz/blog/ip-nfts-for-researchers-a-new-biomedical-funding-paradigm) , so purchasers can fund projects in exchange for access to resulting datasets. Data marketplaces are also emerging to allow the exchange of access rights to datasets. Standardized legal frameworks (https://youtu.be/5_ZNsiWhXfU) around areas like licensing and IP ownership rights are being developed to define how decentralized knowledge sharing models comply with regulations. Striking a balance between openness and commercial viability is an ongoing challenge. Decentralized Governance Decentralizing decision-making and control over scientific priorities is another defining goal of the DeSci paradigm shift. This means finding ways to include broader stakeholder participation beyond just centralized research institutions. A number of DeSci projects are working on reputation systems as a way to quantify contributions and grant different levels of governance influence. Metrics assessing community involvement, past research outputs, peer reviews etc. can help decentralize control. 
Patient groups and other interested community members are participating more actively in biomedical research DAOs (https://medium.com/molecule-blog/the-emergence-of-biotech-daos-407e31748cd4) using token-based voting and proposals to help determine key priorities and funding allocations. For example, members of the VitaDAO community leverage $VITA tokens for governance over which longevity studies get funded. Scientists themselves also have opportunities for more autonomy and ownership by participating in DeSci networks as contributors or stakeholders rather than relying on centralized academic institutions. Avoiding administrative gatekeepers can increase researcher agency and velocity. Decentralized Execution Emerging DeSci models provide more decentralized options for actually executing and operationalizing scientific research, introducing alternatives to relying solely on centralized research institutions. Biotech startups are increasingly leveraging decentralization (https://future.com/a-guide-to-decentralized-biotech/) including sharing lab space in coworking models, hiring talent across borders, and collaborating on projects. Other options mentioned for decentralizing research operations include: - Virtual biotech models (https://future.com/a-guide-to-decentralized-biotech/) that outsource lab processes to contract research organizations (CROs), sponsored research agreements (SRAs), and cloud labs. - LabDAO building a decentralized marketplace (https://youtu.be/J1hUafXYchg) to connect researchers with microCROs offering specific services. - Molecule's IP-NFT model (https://youtu.be/5_ZNsiWhXfU) allowing tokenized ownership or rewards for remote contributors participating across drug development stages. DeSci DAOs
- AsteriskDAO (https://asteriskdao.xyz/): Addressing the knowledge and resource gap in women’s non-reproductive health through worldwide research and IP funding. Twitter (https://twitter.com/asteriskdao), Discord (https://discord.gg/saMwDEZgq2), Telegram (https://t.me/asteriskdao), One Pager (https://www.notion.so/asteriskdao/AsteriskDAO-One-Pager-53c3baea512a45b29f7eff9850c12bc1?pvs=4)
- AthenaDAO (https://www.athenadao.co/): Advancing women’s health research, education, and funding, with a focus on translational R&D. Reproductive Health Report (https://www.athenadao.co/#Report)
- CannabisDAO (https://cannabis-dao.xyz/): Supporting and developing the cannabis industry using DeSci and blockchain technology. Twitter (https://twitter.com/CANN_DAO), Discord (https://discord.gg/qfrd2wtmmB), Telegram (https://t.me/+hYHJpANjigI4NWVl), Docs (https://docs.cannabis-dao.xyz/), Medium (https://medium.com/@cann_dao)
- Cerebrum DAO (https://cerebrumdao.com/): Investing in solutions & cures to Alzheimer’s disease and advancing brain wellness and longevity. Twitter (https://twitter.com/Cerebrum_DAO), Discord (https://discord.com/invite/pSAbaHf7Rf)
- CrunchDAO (https://www.crunchdao.com/): Solving complex problems through collective intelligence in quantitative finance. Twitter (https://twitter.com/CrunchDAO), DeSci Domain (https://desci.crunchdao.com/), White Paper (https://gateway.crunchdao.com/ipfs/QmZMqphoBT3dwzvMg7pgyvQPhqj1oCs2x7eWU1jXLZvDi6)
- CureDAO (https://www.curedao.org/): Minimizing suffering by accelerating clinical discovery through a community-owned rewards system. Discord (https://curedao.org/discord?utm_source=desci-wiki), Whitepaper (https://docs.curedao.org/?utm_source=desci-wiki), App (https://app.curedao.org/?utm_source=desci-wiki), Studies (https://studies.curedao.org/?utm_source=desci-wiki)
- DeSciWorldDAO (https://desci.world/): Connecting decentralized and scientific communities to further the mission of Decentralizing Science. Twitter (https://twitter.com/DeSciWorld), Discord (https://discord.gg/jnEUqVH8xv)
- FrontierDAO (https://www.frontierdao.xyz/): Incubating innovation in fusion energy, space exploration, and climate solutions. Twitter (https://twitter.com/FrontierDAO), Discord (https://discord.com/invite/UQVVw8QWrV), Whitepaper (https://www.frontierdao.xyz/_files/ugd/6a58eb_6b91ad8ddd7a41739c58643df4f486c2.pdf?index=true), NFT Collection (https://opensea.io/frontierdao)
- Future Foods DAO (https://www.futurefoodsdao.com/): Funding early-stage alternative protein open-source research.
- GenomesDAO (https://genomes.io/): Focusing on the security of genomic data and quality of research workflow. Twitter (https://twitter.com/GenomesDAO), Discord (https://discord.com/invite/3DaD2na4XJ), DAO Docs (https://genomes.gitbook.io/genomes.io-docs/)
- GenomicDAO (https://genomicdao.com/): Advancing Precision Medicine for under-representative populations using AI. Twitter (https://twitter.com/genomicdao), Whitepaper (https://docs.genomicdao.com/genomicdao-whitepaper/)
- HairDAO (https://www.hairdao.xyz/): Searching for a cure for hair loss. Twitter (https://twitter.com/HairDAO_), Discord (https://discord.com/invite/nEKD4qBPSU), Whitepaper (https://uploads-ssl.webflow.com/61a12cfcc8291d38310ab2a4/61fb7b4f82f61e246e1d1f10_HairDAO%20White%20Paper.pdf)
- Immortality (https://imt.cx/): Cryptocurrency focused on age prevention and reversal. Whitepaper (https://imt.cx/assets/pdf/whitepaper.pdf)
- Jocelyn DAO (https://jocelyn.gosh.sh/): Offering free tools for research code reproducibility and safety. Twitter (https://twitter.com/phs_dao), Telegram (https://t.me/desci_gosh), LinkedIn (https://www.linkedin.com/company/91598780)
- LabDAO (https://www.labdao.com/): An open, community-run network of laboratories with a CRO/Marketplace focus. Twitter (https://twitter.com/lab_dao), Resources (https://arye.substack.com/p/building-a-labdao-for-web3-biotech), UltraRare Podcast (https://rss.com/podcasts/ultrarare/385698/)
- Open Science DAO (https://opsci.io/): A community-owned ecosystem revolutionizing collaboration and democratizing funding. News (https://pulse.opsci.io/)
- PsyDAO (https://www.psydao.io/): Funding research at the intersection of psychedelics and mental health. Twitter (https://twitter.com/psy_dao?s=20), Discord (https://discord.gg/hUH4MWxVFx), Telegram (https://t.me/+KvFGR2sVlOg2NTM0), Video (https://youtu.be/QInIDUDf_YQ?t=3797)
- Reputable DAO (https://www.reputable.health/): A biohacking and personalized wellness community. Discord (https://discord.com/invite/bXaBCt4ueb), Whitepaper (https://docs.reputable.health/product-docs/)
- Research Collective (http://researchcollective.io/): Performing decentralized trials and ‘adversarial research’. Telegram (https://t.me/+hDgQ1cZ_jZQ3MDgx)
- ValleyDAO (http://valleydao.bio/): Supporting the transition to a sustainable bioeconomy by funding democratically elected academic research. Twitter (https://twitter.com/valley_dao), Discord (https://discord.gg/APD46yZD7E), Announcement Article (http://valleydao.medium.com/5bc470a58482)
- VitaDAO (http://vitadao.com/): Funding longevity research with a mission to extend human life and healthspan. Links (https://vitadao.com/links), Governance Proposals (https://gov.vitadao.com/), YouTube (https://www.youtube.com/c/vitadao), Podcasts (https://anchor.fm/vitadao), Articles (https://vitadao.medium.com/), Newsletter (http://newsletters.vitadao.com/)
Resources
- DeSci Berlin: DeSci.Berlin 2022 - All Talks (https://www.youtube.com/playlist?list=PLYCWARA8YNdpVj31TutmnxptlK8Wy7O6D), DeSci.Berlin 2023 (https://www.youtube.com/playlist?list=PLYCWARA8YNdpZkxs-3a8f2VSUTp6EAlUu)
- DeSci London (https://www.youtube.com/@descilondon/videos)
- Schelling Point 2022: DeSci @ Schelling Point Amsterdam 2022 (https://www.youtube.com/playlist?list=PLbOp_NX_vgZu251r8y-ZdZmhaeVCVgq7R)
Articles
- A Guide to DeSci, the Latest Web3 Movement - a16z crypto (https://future.a16z.com/what-is-decentralized-science-aka-desci/). What to know about the new DeSci movement.
- With cryptocurrency and NFTs, ‘decentralized science’ seeks to upend drug industry financing (https://www.statnews.com/2022/06/23/with-cryptocurrency-and-nfts-decentralized-science-seeks-to-upend-drug-industry-financing/)
- Decentralized science platform Molecule raises $13 million in seed funding (https://www.theblock.co/post/151539/decentralized-science-platform-molecule-raises-13-million-in-seed-funding). A round led by venture capital firm Northpond Ventures.
- How to DAO with People, not just Protocols (https://mikemccoy.substack.com/p/how-to-dao-with-people-not-just-protocols). The concepts and tools to understand how to build decentralized online communities.
- Non fungible tokens (NFTs) for academic publications? (https://gencore.bio.nyu.edu/nfts-for-academic-publications/)
- How scientists are embracing NFTs (https://www.nature.com/articles/d41586-021-01642-3). Nature: is a trend of auctioning non-fungible tokens based on scientific data a fascinating art fad, an environmental disaster or the future of monetized genomics?
- Epoch 16 - Decentralized Science - DeSci (https://bowtiedbiotech.substack.com/p/epoch-16-decentralized-science-desci?s=r). Pioneering Future Business Models for Biotech Financing & Governance.
- DeSci: Can crypto improve scientific research? (https://cointelegraph.com/magazine/2022/04/15/desci-tokens-help-improve-scientific-research). DeSci is a new movement of citizen scientists, open-access scientific research and crowd-sourced peer review funded by crypto that’s gathering pace in 2022.
- A DeSci Origin Story (https://medium.com/coinmonks/a-desci-origin-story-b6b234f7b1a3). How EncrypGen coined a term and helped start a movement.
- DeSci: How Can DAOs Facilitate Collaboration And Push The Open S… (https://coinvise.mirror.xyz/d_jTD1q4pFNXT8JwWUAp0o9TUacsBWNHr_LnRvTmivc)
- How Do Decentralized Science (DeSci) Organizations Work? (https://www.lifespan.io/topic/how-do-decentralized-science-desci-organizations-work/). DeSci organizations, such as labDAO, attempt to harness the potential of blockchain and Web3 technology to reduce the challenges faced by scientific research.
- AIBC Intelligence: The rise of DeSci, Blockchain and the future of Research (https://aibc.world/news/aibc-intelligence-the-rise-of-desci-blockchain-and-the-future-of-research/)
- Why Science? Why Blockchain? Why A DAO Now? (https://mirror.xyz/0x4A35674727c44cf4375d80C6171281Ba2f764213/oQ8iKsOm3Hb05pUA3oV_GFyiyqcmJJk6cFnwudXsbP8). By Paige Donner.
## Delegated Proof of Stake (DPoS) URL: https://radix.wiki/contents/tech/core-concepts/delegated-proof-of-stake-dpos Updated: 2026-02-03 Delegated Proof of Stake (DPoS) is a Sybil protection mechanism invented by Dan Larimer (https://moreequalanimals.com/about/) and used by certain distributed ledgers, including Radix. In DPoS, holders can delegate their network tokens to validators in exchange for a share of the staking rewards. Usually, inclusion in the approved validator set is conditional upon token holdings. Staking Staking, in the context of distributed ledgers, involves participants committing (or ‘staking’) a designated amount of the network's native tokens to support operations like transaction validation and consensus maintenance. This process is central to Proof of Stake (PoS) consensus algorithms, a contemporary alternative to traditional Proof of Work (PoW) systems. Benefits - Validators receive rewards for staking, typically in the form of new coins or transaction fees. These are often termed ‘staking rewards.’ - Staking enhances blockchain security, decentralization, and energy efficiency. Challenges - Possibility of network control centralizing amongst significant stakeholders. - Risk of losing staked coins due to network attacks or validator misconduct. - The network's security relies on a high Nakamoto Coefficient (the smallest number of nodes that could disrupt the network). Radix and Staking Radix uses a specialized staking mechanism to fortify its network's integrity and security. Staking Mechanisms - Radix mandates that nodes have a meaningful stake to deter malicious actors from gaining undue influence over the network.
- Staking rewards on Radix are distributed in proportion to the staked amount, encouraging network security.

DPoS (Delegated Proof of Stake) on Radix

- DPoS in Radix appoints validators responsible for adding new blocks and maintaining transactional integrity.
- Users can delegate tokens to particular validators, who subsequently share block rewards with them.
- Delegation on Radix involves "locking" tokens in support of certain node operators; it signals trust and does not transfer ownership of the tokens.
- Unstaking is subject to a delay of 500 epochs (https://learn.radixdlt.com/article/how-to-stake-and-unstake-xrd), which equates to approximately 10-15 days.

Staking Components & Validator Interaction

- In Radix's new staking model, users can stake and unstake $XRD tokens to gain emission rewards.
- Upcoming dApps will interact with "validator components" via method calls, just like any other component.

Understanding Staking Rewards

- Staking rewards in Radix are not listed as individual transactions every epoch. To verify them, compare the initially staked amount with the current one. Detailed information is accessible on radixscan.io (http://radixscan.io).

Detailed Staking Process on Radix

Eligibility and Procedure:

- Any $XRD token holder can stake and gain emission rewards.
- Users stake tokens through the Radix Desktop Wallet, with a list of available validators on the Radix Explorer.
- After initiating staking, stakers earn emission rewards from the subsequent epoch onward, and these rewards are automatically re-staked.
- There is a delay of approximately 10-15 days for unstaking, serving as a security measure to detect and handle any malicious node activity. During this period, no rewards are earned.

Understanding $XRD Emissions:

- Emissions pertain to the creation of new $XRD tokens as rewards, occurring at the conclusion of fixed intervals termed epochs.
- Approximately 300 million $XRD tokens are minted as rewards each year.
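The two "Example Calculation" formulas given in this section for staker and validator emissions can be sketched in code. This is a toy illustration only, treating all percentages as fractions between 0 and 1; the per-epoch emission figure is hypothetical:

```python
# Toy illustration of the staking-emission split described in this section.
# All percentages are fractions (0-1); the epoch emission figure is hypothetical.

def staker_emission(emission, stake_pct, penalty=0.0, fee=0.0):
    # Emission x Stake Percentage x (100% - Penalty) x (100% - Fee)
    return emission * stake_pct * (1 - penalty) * (1 - fee)

def validator_emission(emission, node_stake_pct, fee, penalty=0.0):
    # Emission x Node Stake Percentage x Fee Percentage x (100% - Penalty)
    return emission * node_stake_pct * fee * (1 - penalty)

epoch_emission = 10_000  # hypothetical XRD minted this epoch

# A staker providing 0.1% of all stake, via a validator charging a 2% fee:
print(staker_emission(epoch_emission, stake_pct=0.001, fee=0.02))        # 9.8

# The validator itself, holding 5% of all delegated stake:
print(validator_emission(epoch_emission, node_stake_pct=0.05, fee=0.02))  # 10.0
```

A validator penalty reduces both quantities, which is why stakers are incentivized to pick reliably participating nodes.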
- Emissions can be penalized if validators are lax in participating in consensus. This ensures stakers select efficient nodes and that nodes maintain their efficiency.
- At the onset of each epoch, the total $XRD staked to every validator is checked and, based on this, 100 validators are chosen. Tokens minted at the epoch's end are distributed according to this list.
- Individual $XRD Staker Emissions: Stakers receive a percentage of total emissions proportionate to their stake, reduced if the node they're staked to has penalties or fees.
- Example Calculation: Emission × Stake Percentage × (100% − Penalty) × (100% − Fee)
- Validator Node-runner Emissions: Validators receive rewards based on their fee percentage and the total stake delegated to their node.
- Example Calculation: Emission × Node Stake Percentage × Fee Percentage × (100% − Penalty)

Earnings as a Validator:

- Potential validators must register their nodes, but only the top 100 nodes by staked amount are selected as validators.
- Validators not only earn from staking but also through specific validator fees.

Special Incentives: Radix Tokens Jersey Limited (RTJL) offers a distinct "subsidy" for validator node-runners, which is sourced from a 600m $XRD token reserve set up at the network's inception. This is separate from the standard protocol.

Potential Risks

Delegating staking to validators isn't devoid of risks. The introduction of slashing mechanisms is expected to mitigate the risk of dishonest validators. Caution is vital, and the Radix team is actively working on mechanisms to prevent issues like front-running.

## Radix Booster Grants

URL: https://radix.wiki/contents/resources/radix-booster-grants Updated: 2026-01-28

The Radix Booster Grants program is a funding initiative designed to support and accelerate the development of decentralized applications (dApps) within the Radix ecosystem (/ecosystem).
Launched as part of the broader 250M $XRD ($10M+) Ecosystem Fund (https://www.radixdlt.com/blog/250m-xrd-10m-to-power-the-next-web3-breakthrough---the-new-radix-ecosystem-fund) , this program aims to foster innovation and growth in the Radix network by providing financial support to developers and entrepreneurs. The Radix Booster Grants offer up to $160,000 in $XRD tokens to eligible projects, distributed across four consecutive tracks (https://www.radixdlt.com/blog/radix-booster-grants-apply-now) : MVP Booster, Launch Booster, Refine Booster, and Growth Booster. Each track is designed to support projects at different stages of development, from initial concept to market growth. Key features of the program include: - Financial support ranging from $5,000 to $140,000 per track. - Networking opportunities with grant alumni and the Radix community. - Co-marketing opportunities. - Access to partner offers such as AWS credits and Hacken audits. - Exclusive dApp features via the "Runs on Radix" campaign. The Radix Booster Grants have supported a diverse range of projects, including decentralized exchanges (https://www.radixdlt.com/blog/babylon-booster-grants-ociswap) , NFT marketplaces (https://www.radixdlt.com/blog/babylon-booster-grants-radland) , gaming platforms (https://www.radixdlt.com/blog/babylon-booster-grants-infinitelabs) , and infrastructure services (https://www.radixdlt.com/blog/babylon-booster-grants-xrd-domains) . These grants play a crucial role in expanding the Radix ecosystem and driving adoption of the Radix network's unique technologies. Background Purpose and Goals of the Grant Program The Radix Booster Grants program was established as part of a broader initiative to support and empower ambitious founders (https://www.radixdlt.com/blog/250m-xrd-10m-to-power-the-next-web3-breakthrough---the-new-radix-ecosystem-fund) in building decentralized applications that will power the next wave of Web3 on Radix. 
The primary objectives of the program include: - Accelerating ecosystem growth: By providing financial support to promising projects, the program aims to catalyze the development of a diverse range of dApps on the Radix network. - Fostering innovation: The grants encourage developers to leverage Radix's unique technological features to create novel solutions in areas such as DeFi, NFTs, gaming, and infrastructure services. - Attracting talent: By offering substantial funding and support, the program seeks to attract skilled developers and entrepreneurs to the Radix ecosystem. - Improving user experience: Many of the funded projects focus on creating user-friendly interfaces and tools, aiming to make Web3 technologies more accessible to mainstream users. - Enhancing ecosystem infrastructure: Some grants are allocated to projects developing essential tools and services that other dApps can build upon, strengthening the overall Radix ecosystem. - Promoting community-driven development (https://www.radixdlt.com/blog/babylon-booster-grants-ociswap) : The program encourages projects to actively seek community feedback and collaborate with other ecosystem participants. By supporting projects at various stages of development, from concept to market growth, the Radix Booster Grants program plays a crucial role in realizing Radix's vision of a more accessible, efficient, and secure decentralized future. Grant Structure The Radix Booster Grants program is structured into four consecutive tracks (https://www.radixdlt.com/blog/radix-booster-grants-apply-now) , each designed to support projects at different stages of development. Teams can receive each grant once, with a total funding of up to $160,000 available per project across all tracks. 3.1 MVP Booster ($5,000) The MVP Booster is the initial grant track, aimed at projects in their earliest stages of development. 
Key features include: - Funding amount (https://www.radixdlt.com/blog/radix-booster-grants-apply-now) : $5,000 in $XRD - Purpose: To support the development of a Minimum Viable Product (MVP) - Eligibility: Open to projects building their first MVP either on Radix Stokenet (testnet) or as a beta version on Radix mainnet - Goal: To help projects validate their concept and prepare for the next stage of development 3.2 Launch Booster ($15,000) The Launch Booster is the second track, designed to support projects ready to deploy on the Radix mainnet. Key features include: - Funding amount (https://www.radixdlt.com/blog/radix-booster-grants-apply-now) : $15,000 in $XRD - Purpose: To facilitate the launch of the product on Babylon mainnet - Eligibility: Projects that have completed their MVP and are ready for mainnet deployment - Goal: To assist projects in transitioning from testing to live operations on the Radix network 3.3 Refine Booster (up to $50,000) The Refine Booster is an optional track for projects seeking to improve their existing dApp. Key features include: - Funding amount (https://www.radixdlt.com/blog/radix-booster-grants-apply-now) : Up to $50,000 in $XRD - Purpose: To support improvements in the dApp and increase traction - Eligibility: Projects that have launched on mainnet and are looking to enhance their product - Goal: To help projects refine their offering and prepare for significant growth 3.4 Growth Booster (up to $140,000) The Growth Booster is the final and largest grant track, aimed at supporting projects to reach their seed stage. 
Key features include: - Funding amount (https://www.radixdlt.com/blog/radix-booster-grants-apply-now) : Up to $140,000 in $XRD - Purpose: To support projects in achieving significant growth and reaching seed stage - Eligibility: Projects that demonstrate strong potential for growth and have a clear plan for achieving it - Application process: Includes pitching to the RDX Works Ecosystem investment team - Requirements: Teams need to present a comprehensive plan for product development, business strategy, and marketing, along with a detailed budget Additional Support In addition to the main grant tracks, the program offers: - Audit Grants (https://www.radixdlt.com/blog/radix-booster-grants-apply-now) : Additional funding to cover some of the costs of code audits - Milestone rewards (https://www.radixdlt.com/blog/250m-xrd-10m-to-power-the-next-web3-breakthrough---the-new-radix-ecosystem-fund) : Rewards for achieving specific milestones - Fundraising rewards (https://www.radixdlt.com/blog/250m-xrd-10m-to-power-the-next-web3-breakthrough---the-new-radix-ecosystem-fund) : Up to 10% of raised funds (capped at $50k) for projects that successfully raise external funding This structured approach allows the Radix Booster Grants program to support projects throughout their development lifecycle, from initial concept to market growth and beyond. Application Process The Radix Booster Grants program has a structured application process designed to identify and support promising projects within the Radix ecosystem. This section outlines the eligibility criteria, application steps, and review process. Eligibility Criteria To be eligible for a Radix Booster Grant, applicants must meet the following criteria: - Building a dApp on Radix (https://www.radixdlt.com/blog/radix-booster-grants-apply-now) : The project must be developed for the Radix network. 
- Stage-appropriate (https://www.radixdlt.com/blog/radix-booster-grants-apply-now) : Applicants should apply for the grant track that matches their project's current development stage. - Geographic restrictions (https://www.radixdlt.com/blog/radix-booster-grants-apply-now) : While the program is open to projects globally, some countries may be restricted. Applicants are advised to check the list of restricted countries. - KYC/KYB compliance (https://www.radixdlt.com/blog/radix-booster-grants-apply-now) : All individuals and teams must pass Know Your Customer (KYC) or Know Your Business (KYB) checks to receive the grant. Application Steps The application process typically follows these steps: - Initial Application (https://www.radixdlt.com/blog/radix-booster-grants-apply-now) : - Fill out the application form, which takes about 30-60 minutes to complete. - Include a brief 1-minute video introducing the team. - Interview (https://www.radixdlt.com/blog/radix-booster-grants-apply-now) : - If the initial application is approved, applicants are invited to a 10-minute interview call to discuss their project in more detail. - Project Scope and Agreement (https://www.radixdlt.com/blog/radix-booster-grants-apply-now) : - Work with the Radix team to define the project scope and set deadlines. - Agree on the specific milestones that need to be achieved to claim the grant. - KYC/KYB Process (https://www.radixdlt.com/blog/radix-booster-grants-apply-now) : - Complete the necessary Know Your Customer or Know Your Business checks. - Grant Claiming (https://www.radixdlt.com/blog/radix-booster-grants-apply-now) : - Upon reaching the agreed milestones or deadline, submit a claim for the grant. - The Radix ecosystem team will review and test the dApp to ensure the grant scope has been fulfilled. - Payment and Promotion (https://www.radixdlt.com/blog/radix-booster-grants-apply-now) : - Once approved, the grant payment process begins. 
- The Radix team also initiates promotional activities for the project. Review and Selection Process The review and selection process involves the following: - Continuous Review (https://www.radixdlt.com/blog/radix-booster-grants-apply-now) : Applications are reviewed on an ongoing basis. - Evaluation Criteria (https://www.radixdlt.com/blog/250m-xrd-10m-to-power-the-next-web3-breakthrough---the-new-radix-ecosystem-fund) : Projects are assessed based on their potential impact on the Radix ecosystem, innovation, team capability, and alignment with Radix's technological advantages. - Feedback Loop (https://www.radixdlt.com/blog/radix-booster-grants-apply-now) : If a project is not selected, applicants are encouraged to refine their project and apply again when ready. - Milestone-based Disbursement (https://www.radixdlt.com/blog/radix-booster-grants-apply-now) : Grants are distributed based on achieving agreed-upon milestones, ensuring accountability and progress. The application process is designed to be thorough yet flexible, allowing for adjustments if projects face unexpected challenges or delays. The Radix team maintains open communication with applicants throughout the process to provide support and guidance. Funded Projects The Radix Booster Grants program has supported a diverse range of projects, contributing to the growth and development of the Radix ecosystem. These projects span various categories including DeFi, NFTs, gaming, and infrastructure. 
Below is a table of some of the key projects that have received funding:

| Project Name | Category | Grant Amount | Key Features |
| --- | --- | --- | --- |
| CaviarNine (/ecosystem/caviarnine) | DeFi | $15,000 (https://www.radixdlt.com/blog/babylon-booster-grants-carviarnine) | Concentrated liquidity pools; decentralized order book; liquid staking pool; Swap Widget |
| Ociswap (/ecosystem/ociswap) | DeFi | $15,000 (https://www.radixdlt.com/blog/babylon-booster-grants-ociswap) | Concentrated liquidity; SPLASH 2.0 program; dynamic fee settings |
| Weft Finance (/ecosystem/weft-finance) | DeFi | $15,000 (https://www.radixdlt.com/blog/babylon-booster-grants-weft-finance) | Decentralized lending and borrowing; 'Wefties' NFTs for loans; flexible collateral ratios |
| DefiPlaza (/ecosystem/defiplaza) | DeFi | $15,000 (https://www.radixdlt.com/blog/babylon-booster-grants-defiplaza) | Multi-token trading platform; single-sided liquidity provision; CALM model for impermanent loss |
| Fibonacci Finance (/ecosystem/fibonacci-finance) | DeFi | $15,000 (https://www.radixdlt.com/blog/babylon-booster-grants-fibonacci-finance) | Financial risk services; custom risk engine designs; on-chain liquidity management |
| RadLand (/ecosystem/radland) | NFT/Gaming | $15,000 (https://www.radixdlt.com/blog/babylon-booster-grants-radland) | NFT marketplace; bulk minting capabilities; support for Olympia network NFTs |
| Impahla (/ecosystem/impahla) | NFT/Gaming | $15,000 (https://www.radixdlt.com/blog/babylon-booster-grants-imphala) | NFT creation and management; IPFS integration; market analytics |
| InfiniteLabs (/ecosystem/infinite-labs) | NFT/Gaming | $15,000 (https://www.radixdlt.com/blog/babylon-booster-grants-infinitelabs) | 3D game with NFT integration; multiple game modes; in-game NFT marketplace |
| XRD Domains (/ecosystem/xrd-domains) | Infrastructure | $15,000 (https://www.radixdlt.com/blog/babylon-booster-grants) | Domain management; Namelets for record management; Genus domains |
| RadixCharts (/ecosystem/radixcharts) | Infrastructure | $15,000 (https://www.radixdlt.com/blog/babylon-booster-grants-radixcharts) | Network analytics; real-time data on various metrics; ecosystem performance comparison |
| ShardSpace (/ecosystem/shardspace) | Infrastructure | $15,000 (https://www.radixdlt.com/blog/babylon-booster-grants-shardspace) | User interface for dApp interaction; portfolio management tools; explorer pages for network data |
| Trove (/ecosystem/trove) | Other | $15,000 (https://www.radixdlt.com/blog/babylon-booster-trove) | Peer-to-peer trading tool; Open Swap feature; complex asset combination support |
| Gable (/ecosystem/gable-finance) | Other | $15,000 (https://www.radixdlt.com/blog/babylon-booster-grants-gable) | Income generation with LSU tokens; flash loans; automated concentrated liquidity |
| RadLock (/ecosystem/radlock) | Other | $15,000 (https://www.radixdlt.com/blog/babylon-booster-grant-radlock) | Project asset-locking solution; liquidity locking; team vesting for token distribution |
| Backeum (/ecosystem/backeum) | Other | $15,000 (https://www.radixdlt.com/blog/babylon-booster-grants-backeum) | Content creator monetization; NFT rewards for supporters; cryptocurrency integration |
| Root Finance | DeFi | $5,000 (MVP Booster) | Decentralized lending and borrowing; multi-asset support; Health Bar tool |
| Selfi Social | Social Media | $5,000 (MVP Booster) | Chrome extension for X (Twitter); Web3 integration in social media; social DeFi interactions |
| NFT Wars | NFT/Gaming | $5,000 (MVP Booster) | Hero-centric idle mobile RPG; NFT integration in gameplay; multiple game modes |

Impact and Outcomes

The Radix Booster Grants program has played a significant role in fostering the growth and development of the Radix ecosystem. This section highlights key metrics, achievements, and ecosystem growth statistics that demonstrate the program's impact.
Key Metrics and Achievements - Grant Distribution: - Over half a million USD distributed (https://www.radixdlt.com/blog/250m-xrd-10m-to-power-the-next-web3-breakthrough---the-new-radix-ecosystem-fund) to builders in the Radix community since the launch of the first grants program in February 2023. - Allocation of $170k in Booster Grants (https://www.radixdlt.com/blog/250m-xrd-10m-to-power-the-next-web3-breakthrough---the-new-radix-ecosystem-fund) to various projects. - Distribution of $90k in Milestone Rewards (https://www.radixdlt.com/blog/250m-xrd-10m-to-power-the-next-web3-breakthrough---the-new-radix-ecosystem-fund) to encourage project progress. - Over $20k in developer incentives (https://www.radixdlt.com/blog/250m-xrd-10m-to-power-the-next-web3-breakthrough---the-new-radix-ecosystem-fund) , including Scrypto challenges. - Project Diversity: - Support for projects across various categories (https://www.radixdlt.com/blog/babylon-booster-grants-carviarnine) , including DeFi, NFTs, gaming, and infrastructure services. - Funding provided to over 20 teams enrolled in the Radix Booster grants (https://www.radixdlt.com/blog/mvp-booster-winners-root-finance-selfi-social-and-nft-wars) program. - Developer Engagement: - More than 100 applications received (https://www.radixdlt.com/blog/mvp-booster-winners-root-finance-selfi-social-and-nft-wars) for the Radix Booster Grants. - Increasing participation in developer incentives and Scrypto challenges. Ecosystem Growth Statistics - Total Value Locked (TVL): - CaviarNine reported over $4 million in TVL (https://www.radixdlt.com/blog/babylon-booster-grants-carviarnine) as of November 2023. - Ociswap contributed to pushing Radix into the top 100 by TVL on DeFi Llama (https://www.radixdlt.com/blog/babylon-booster-grants-ociswap) , with over $2 million in TVL. 
- User Adoption: - Weft Finance reported over 300 active users (https://www.radixdlt.com/blog/babylon-booster-grants-weft-finance) , with more than 20 million $XRD staked through the platform. - Trove facilitated over 200 swap offers (https://www.radixdlt.com/blog/babylon-booster-trove) and attracted more than 400 unique users. - NFT Ecosystem Growth: - Trove reported the swapping of over 90 NFTs (https://www.radixdlt.com/blog/babylon-booster-trove) within its platform. - Impahla attracted 77 creators and 48 collections (https://www.radixdlt.com/blog/babylon-booster-grants-imphala) in its first two days of operation. - Infrastructure Development: - XRD Domains reported over 500 domains minted (https://www.radixdlt.com/blog/babylon-booster-grants) by Stokenet testers in the first week of their testnet going live. - ShardSpace aimed to attract 500 unique users per month by the end of 2023 (https://www.radixdlt.com/blog/babylon-booster-grants-shardspace) , with plans to double this to 1000 by Q1 2024. - DeFi Innovation: - Gable Finance facilitated the borrowing of 1 million $XRD (https://www.radixdlt.com/blog/babylon-booster-grants-gable) , showcasing the platform's capability in handling significant transaction volumes. - DefiPlaza set a target of amassing at least 10 million USD in liquidity (https://www.radixdlt.com/blog/babylon-booster-grants-defiplaza) , a goal they had already surpassed on Ethereum. These metrics and achievements demonstrate the significant impact of the Radix Booster Grants program on the growth and diversification of the Radix ecosystem. The program has not only provided financial support but has also catalyzed innovation, user adoption, and the development of critical infrastructure within the Radix network. Challenges and Future Developments The Radix Booster Grants program, while successful in many aspects, has faced challenges and identified areas for improvement. 
This section outlines some of the lessons learned and planned developments for the future of the program. Lessons Learned - Diverse Project Needs: - The program has recognized that different projects require varying levels of support (https://www.radixdlt.com/blog/radix-booster-grants-apply-now) at different stages of development. - This realization led to the creation of multiple grant tracks (MVP, Launch, Refine, and Growth) to better cater to projects at various stages. - Importance of Community Feedback: - Projects like Impahla demonstrated the value of incorporating community feedback (https://www.radixdlt.com/blog/babylon-booster-grants-imphala) early in the development process. - This has emphasized the need for grant recipients to engage closely with the Radix community throughout their development journey. - Balancing Innovation and Practicality: - The program has learned to balance support for highly innovative projects with those addressing immediate ecosystem needs. - This balance is crucial for both pushing technological boundaries and building a robust, usable ecosystem. - Regulatory Considerations: - As the crypto landscape evolves, there's an increasing need for projects to consider regulatory compliance (https://www.radixdlt.com/blog/babylon-booster-grants-backeum) . - The program has recognized the importance of supporting projects in navigating these regulatory challenges. Planned Improvements - Expanded Funding Tracks: - The program is considering introducing new funding tracks (https://www.radixdlt.com/blog/250m-xrd-10m-to-power-the-next-web3-breakthrough---the-new-radix-ecosystem-fund) to address specific ecosystem needs, such as infrastructure development or interoperability solutions. - Enhanced Mentorship: - Plans are underway to provide more comprehensive mentorship (https://www.radixdlt.com/blog/introducing-radix-foundry-program) to grant recipients, leveraging the expertise of successful projects within the ecosystem. 
- Collaboration Initiatives: - The program aims to foster more collaboration between grant recipients, potentially through joint hackathons or collaborative project opportunities. - Streamlined Application Process: - Based on feedback, there are plans to further streamline the application and review process (https://www.radixdlt.com/blog/radix-booster-grants-apply-now) to reduce turnaround times and provide quicker feedback to applicants. - Focused Ecosystem Development: - The program is identifying key areas within the ecosystem that require development (https://www.radixdlt.com/blog/introducing-radix-foundry-program) , such as lending platforms, real-world asset tokenization, and advanced NFT functionalities. - Future grant rounds may prioritize projects addressing these specific needs. - Increased Support for Post-Launch Growth: - Recognizing that many projects face challenges after initial launch, the program is exploring ways to provide continued support for marketing, user acquisition, and scaling (https://www.radixdlt.com/blog/250m-xrd-10m-to-power-the-next-web3-breakthrough---the-new-radix-ecosystem-fund) . - Integration with Radix Foundry: - The Radix Foundry program (/contents/resources/radix-foundry-program) , which offers up to $250,000 in funding for high-potential projects, is being more closely integrated with the Booster Grants program to provide a clear growth path for successful projects. - Community Governance: - There are discussions about introducing elements of community governance in the grant selection process, allowing the Radix community to have a say in which projects receive funding. These planned improvements aim to enhance the effectiveness of the Radix Booster Grants program, ensuring it continues to drive innovation and growth within the Radix ecosystem while adapting to the evolving needs of projects and the broader crypto landscape. 
Page to be updated soon

## Composability

URL: https://radix.wiki/contents/tech/core-concepts/composability Updated: 2026-01-28

Composability is the general ability of components of a system to be recombined into larger structures and for the output of one to be the input of another. Within crypto, composability is the ability of decentralized applications (dApps) and DAOs to effectively clone and integrate one another (syntactic composability), and for software components such as tokens and messages to be interoperable between them (morphological composability).

Syntactic Composability

Smart contract platforms like Ethereum already achieve syntactic composability pretty well: every smart contract on the protocol is public and can be called by any other, which means that any software logic only has to be worked out once before it is available for reuse by the entire ecosystem. In practice, this means that any Ethereum dApp can use Uniswap's contracts to manage token exchanges, or organizations can use Aragon's Client contract for on-chain governance.

Being able to reuse open-source components is the superpower that makes building in Web3 extremely efficient. Teams can use huge amounts of existing, reliable code and focus only on building the components that are missing for their project. This exponentially increases the rate of experimentation and innovation. Not having to reinvent the wheel (or worry about being sued by regulators and patent trolls) every time you build a business makes Web3 orders of magnitude more efficient at allocating resources than Web2. It also invokes the magic of compounding.

Atomic Composability

Importantly, Radix allows for 'atomic' composability, where several operations across multiple dApps can be bundled into a single transaction and executed together. If one of the operations fails, then the entire transaction fails. This makes it possible to split a trade across multiple exchanges, or vote on several DAO proposals at once, without the risk of a partial failure.
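The all-or-nothing behaviour described above can be sketched as follows. This is a toy model (not Radix's actual transaction engine): every operation in a bundle must succeed, or the whole bundle is rolled back and the original state is untouched.

```python
# Toy model of atomic composability: a bundle of operations applied to shared
# state either fully succeeds or leaves the state untouched.
import copy

def run_atomic(state, operations):
    """Apply each operation in order; on any failure, discard all changes."""
    working = copy.deepcopy(state)   # scratch copy = uncommitted transaction
    try:
        for op in operations:
            op(working)              # each op mutates the scratch state
    except Exception:
        return state, False          # partial work is discarded
    return working, True             # all ops succeeded: commit

def pay(frm, to, amount):
    def op(balances):
        if balances[frm] < amount:
            raise ValueError("insufficient funds")
        balances[frm] -= amount
        balances[to] += amount
    return op

balances = {"alice": 100, "bob": 0, "carol": 0}
# The second transfer fails, so the first is rolled back too:
balances, ok = run_atomic(balances, [pay("alice", "bob", 60),
                                     pay("alice", "carol", 60)])
print(ok, balances)  # False {'alice': 100, 'bob': 0, 'carol': 0}
```

A flash loan relies on exactly this property: if the repayment step fails, the borrow step never happened either.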
Atomic composability is vital to decentralized finance (DeFi) because it allows for innovations like 'flash loans', where an asset is borrowed, invested, and paid back within a single transaction.

Morphological Composability

Although a ledger's architecture may be built to facilitate composability, this doesn't guarantee that the internal morphology of dApps, such as functions and interfaces, is automatically compatible between them. This requires collaboration. To this end, a number of application-level standards have been agreed for elements such as tokens, name registries and wallet formats, known on Ethereum as Ethereum Requests for Comment (ERCs). The most famous of these is ERC20, which defines the characteristics of a fungible token on Ethereum.

From a Web2 perspective, the implications of morphological composability are mind-blowing: it is already possible for (take a breath...) DAO token holders to vote on Snapshot and use Zodiac Reality to trigger a transaction from the DAO's treasury to take a $DAI loan from MakerDAO, pool the $DAI on Curve, then deposit the resulting LP tokens into Convex to earn trading fees plus $CRV and $CVX tokens. Composability like this is possible because of the interoperability of tools and the standardized definition of a token.

Web3 Composability: Vote on Snapshot → Zodiac Reality → Borrow $DAI @ MakerDAO → Pool $DAI @ Curve → Deposit Curve LP tokens @ Convex → Earn trading fees + $CRV + $CVX

Beyond finance, characters or chattels from Web3 games like Axie Infinity or Guild of Guardians are instances of non-fungible tokens (NFTs) - unique digital property, standardized in ERC721. Because they are actually owned by users, they can be transferred freely between different games, sold on secondary markets, or even used as collateral for loans.
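Standards like ERC20 work because every token exposes the same minimal interface, so a dApp written once can hold, transfer, and query any compliant token without knowing its internals. A hypothetical sketch of that idea (the names are illustrative, not the actual ERC20 ABI):

```python
# Hypothetical sketch of why a shared fungible-token interface enables
# composability: one generic function works with every compliant token.
from abc import ABC, abstractmethod

class FungibleToken(ABC):
    """Minimal ERC20-style interface (illustrative names, not the real ABI)."""
    @abstractmethod
    def balance_of(self, owner: str) -> int: ...
    @abstractmethod
    def transfer(self, frm: str, to: str, amount: int) -> None: ...

class SimpleToken(FungibleToken):
    def __init__(self, balances):
        self.balances = dict(balances)
    def balance_of(self, owner):
        return self.balances.get(owner, 0)
    def transfer(self, frm, to, amount):
        if self.balance_of(frm) < amount:
            raise ValueError("insufficient balance")
        self.balances[frm] -= amount
        self.balances[to] = self.balance_of(to) + amount

def pay_fee(token: FungibleToken, user: str, treasury: str, fee: int):
    # A dApp written once against the interface works with *any* token.
    token.transfer(user, treasury, fee)

dai_like = SimpleToken({"alice": 100})
pay_fee(dai_like, "alice", "dao", 5)
print(dai_like.balance_of("alice"), dai_like.balance_of("dao"))  # 95 5
```

The `pay_fee` function never inspects which token it was handed; that substitutability is the essence of morphological composability.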
To achieve the equivalent in Web2 would mean somehow convincing Nintendo to share a database with Sony and Microsoft or, back in the finance world, for eTrade to execute transactions on Robinhood. The likelihood of this happening is basically zero: even if Web2 companies weren't all in competition with each other, they are built on incompatible tech stacks that would prevent them from reusing each others' software.

As well as digital possessions, Web3 also allows users to port their identity and reputation between dApps. Instead of signing in with a username and password, users use their Web3 wallet to give selective read-only access to their Ethereum address, which acts as a unique identifier and reputation metric. Having a verifiable 'resume' of token purchases, dApp interactions, and DAO membership has given rise to the 'Degen(erate) score' as a tongue-in-cheek ranking of Web3 literacy, but which is already being included by sincere applicants in Web3 job applications.

In this way, identity and reputation can be seen as currencies, backed by Web3 activity. Like any currency, on-chain identities can be swapped and traded (with all of the confusion that entails), as well as forming the components of an abstracted social network that will form digital communities, working groups and new kinds of 'nations'.

DAO Morphology

At present, there isn't an ERC to standardize DAO structures, functions, and interfaces, but the industry is moving in that direction. DAOstar One is a roundtable of key organizations working towards standardizing the definition and minimal parameters of a DAO. The group's focus is currently on experimentation - so as not to limit innovation - but once implemented, an ERC would bring Lego-like composability to the DAO ecosystem, meaning that Aragon's Finance dApp could be used with Moloch v2, or OpenZeppelin governance contracts could integrate with Gnosis Safe as well as Compound's Governor contracts.
Even before an ERC has been agreed upon, Gnosis are developing a modular system of DAO tools built on their Zodiac Open Standard. Any DAO platform that implements their IAvatar interface will have access to a growing number of Zodiac-compliant tools such as an oracle module that can trigger on-chain executions or a bridging module that enables cross-chain control of a Gnosis Safe. Other developments such as DAOhaus' Boost Foundry (itself built on Moloch v2) are part of a growing trend towards dAppstores for DAOs with plugins for DeFi protocols and additional DAO functionality such as automated payment streams using Superfluid or gated content using Mintgate. Apart from these examples, Web3 software may be open-source and forkable, but that is not in itself a guarantee of compatibility with other projects. The hope is though, that because the switching costs for users are so low and the benefits of interoperability are so high, the dynamics and desire are in place for the industry to align on a shared set of standards as soon as possible. This last point is worth underlining. Web2 companies are notorious for building competitive 'moats' to isolate themselves from competition. These can take the form of favorable regulation, aggressive patent enforcement, high switching costs, data hoarding and many other self-preservation strategies. However, none of these strategies make sense in the open-source, permissionless, transparent world of Web3. Instead, organizations have to build on the assumption that their code will be reused by others and turn this into the vanguard of their growth strategy; composability converts plagiarism from a tail-risk in Web2 into a core assumption and strategic objective in Web3. In such an environment, organizations must become experts at collaboration and symbiosis, which is the reason why initiatives like DAOstar One and the DAO Global Hackathon have attracted so many participants in pursuit of DAO composability. 
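The modular pattern behind tools like Zodiac, where any module written against a common execution interface (such as IAvatar) can drive a treasury, can be caricatured as follows. Every name here is hypothetical, and the real system is implemented in Solidity, not Python:

```python
# Toy caricature of Zodiac-style DAO modularity: any module written against a
# common execution interface can drive the treasury. All names are hypothetical.

class Avatar:
    """Stands in for an IAvatar-like executor (e.g. a treasury or safe)."""
    def __init__(self):
        self.executed = []
        self.enabled_modules = set()

    def enable_module(self, module):
        self.enabled_modules.add(module)

    def exec_from_module(self, module, action):
        if module not in self.enabled_modules:
            raise PermissionError("module not enabled")
        self.executed.append(action)

class OracleModule:
    """Triggers an on-chain action when an off-chain vote result arrives."""
    def __init__(self, avatar):
        self.avatar = avatar

    def report_vote(self, proposal, passed):
        if passed:
            self.avatar.exec_from_module(self, f"execute:{proposal}")

safe = Avatar()
oracle = OracleModule(safe)
safe.enable_module(oracle)
oracle.report_vote("fund-grants-round-2", passed=True)
print(safe.executed)  # ['execute:fund-grants-round-2']
```

Because modules only depend on the executor's interface, an oracle module, a payment-stream module, or a bridge module can all be swapped in without changing the treasury itself.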
DAO Tooling It is standard practice in Web2 for companies to use each other's software for things like desktop publishing and graphic design. DAOs are similar in that they use proprietary software like Discord to manage their communities. Furthermore, Web3 infrastructure enables such tools to be integrated to such an extent that they become components of DAO governance, able to schedule and execute transactions. This is made possible by third-party networks called oracles, which are capable of verifying external data and embedding it in on-chain smart contracts. It is already possible to trigger on-chain executions on Aragon from emoji votes on Discord using Witnet's decentralized oracle network, and as mentioned, Gnosis Zodiac's Reality module integrates reality.eth as an oracle to trigger Safe transactions in response to Snapshot proposals, Discord polls, or any other compatible data. For a deeper look at DAO tooling, check out Incentive Design and Tooling for DAOs and Organizational Legos: The State of DAO Tooling. The integration of real-world data into smart contracts opens up a world of data-composability in Web3 that is not possible to cover in this article but is explored in First-Party Oracles with API3 and 77 Smart Contract Use Cases Enabled by Chainlink. Web3 is Learning to Talk There are very few domains where a monolithic architecture makes sense: supra-national political structures tend to fracture over time; the internet's success is due to many nodes communicating via common protocols; and even the human brain is a modular system. Over time, computing architecture has evolved from monolithic stacks into a diffusion of microservices that can be composed together, upgraded and swapped out as required. Although Web2 is built on such a backend, composable design is not reflected in Web2 companies themselves: effective monopolies like Amazon, Facebook and Google have little incentive to break themselves up into composable microservices. 
In the end though, this may be forced upon them by Web3: the logic of composability is destiny if the testimonies of technology and nature are to be believed. Before this destiny is reached, there are a couple of roadblocks along the way. The first is a monolithic, Web2 mindset. It takes conviction and shareholder votes to adopt a paradigm that relegates one's own product to the status of a mere component. Yet, there are signs that composability has enough traction in Web3 to sell its benefits to the wider tech industry. We have certainly seen this in the pace of adoption and metrics such as Assets Under Management (AUM) for various protocols. Web2 companies will not want to forfeit this kind of growth for long, and in order to participate, they will have to be composable. The second roadblock is scaling. Composability works in theory, but the high fees that we have seen on Ethereum recently are fatal for complex operations across multiple dApps. Bridging between Ethereum and newer chains breaks atomic composability, and a proliferation of new chains will dilute security and liquidity resources until the industry settles on a viable solution. This is an enormous topic that will be covered in greater depth in a forthcoming article. It is no accident that syntax and morphology are concepts lifted from the domain of linguistics. They apply here because software is a language and Web3 - being an expression of software - fits into a linguistic framework. The next level of analysis after syntax is semantics: whether an expression makes sense; and above that is pragmatics: whether it makes sense in context. Undoubtedly, it is possible to route tokens today between dApps with no real utility or purpose. Likewise, many of the combinations being touted do not make much sense in a wider financial or social context, especially in light of Ethereum scaling. 
However, these are the first words of a new language that is infinitely more expressive than any we have seen before, and through the magic of composability, Web3 will find its voice. ## Decentralized Finance (DeFi) URL: https://radix.wiki/contents/tech/core-concepts/decentralized-finance-defi Updated: 2026-01-28 Decentralized Finance, commonly referred to as DeFi, is an umbrella term for a variety of financial applications built on blockchain and cryptocurrency technology, geared towards disrupting traditional financial intermediaries. Overview Unlike the traditional finance sector, which operates through centralized systems governed by institutions such as banks and governments, DeFi is built on blockchain technology, ensuring operations are decentralized, transparent, and resistant to censorship. The inception of DeFi is rooted in the philosophy of eliminating the control that banks and governmental institutions have over money, financial products, and financial services. In traditional finance, these entities can influence the economy by printing more money, regulating transactions, and even denying individuals access to financial services. This centralized control not only poses a risk of censorship but also introduces a layer of trust and potential failure that DeFi aims to mitigate. DeFi operates on the premise that individuals should have unrestricted access to their funds and financial services without the need for intermediaries, based purely on transparent and immutable code. DeFi leverages smart contract networks such as Ethereum (/contents/tech/comparisons/radix-vs-ethereum) or Radix (/) to execute financial transactions and services. These smart contracts are self-executing contracts with the terms of the agreement between buyer and seller being directly written into lines of code. 
This automation not only reduces the need for traditional financial intermediaries but also offers a higher degree of security and trust, as the outcomes are not controlled by any single entity and can be audited by anyone. Moreover, DeFi promises to make financial services more accessible and affordable. Traditional financial systems are known for their high fees, from the cost of borrowing to transaction fees. DeFi, on the other hand, operates with minimal fees as it removes the overhead associated with financial institutions. This democratization of finance has the potential to include unbanked populations worldwide by offering them access to a full spectrum of financial services. In essence, DeFi is not just a new set of tools and technologies but a bold reimagining of finance. It challenges the centralized financial system by offering a decentralized, transparent, and inclusive alternative. While it's still in its early stages, DeFi has already begun to show how it can transform financial services by making them more accessible, less expensive, and more equitable for everyone. Foundations of DeFi DeFi is built upon a trio of foundational technologies: cryptography, blockchain technology, and smart contracts. Each plays a crucial role in ensuring that DeFi applications are secure, transparent, and operate without the need for centralized intermediaries. Understanding these foundations is essential for grasping how DeFi works and appreciating its potential to revolutionize the financial sector. Cryptography Cryptography is the practice of secure communication in the presence of third parties. In the context of DeFi, it is used to ensure the security and privacy of transactions on the blockchain. Cryptography enables the creation of cryptographic keys, digital signatures, and encryption, providing the means for secure peer-to-peer transactions without the need for a trusted third party. 
It ensures that transactions are tamper-proof and that participants can transact securely and anonymously, if they choose. Blockchain Technology Blockchain technology is the backbone of DeFi. It is a distributed ledger technology that allows data to be stored across a network of computers worldwide, making it decentralized and resistant to censorship. Each block in the chain contains a number of transactions, and once a block is added to the chain, the information it contains is immutable. This ensures the integrity and transparency of financial transactions. Blockchain's decentralized nature means that it operates on a peer-to-peer network, eliminating the need for central authorities like banks and governments, thus fostering a new era of financial transactions that are open, transparent, and accessible to anyone with an internet connection. Smart Contracts Smart contracts are self-executing contracts with the terms of the agreement directly written into code. They run on blockchain networks, with Ethereum being the most prominent platform for DeFi applications. Smart contracts automatically execute transactions when predetermined conditions are met, without the need for intermediaries. This automation reduces the potential for human error and fraud, significantly lowering transaction costs and execution times. Smart contracts are the mechanism through which DeFi applications offer services such as lending, borrowing, trading, and insurance, all without the need for traditional financial institutions. Importance The combination of cryptography, blockchain technology, and smart contracts forms the technological bedrock upon which DeFi is built. These technologies together provide a secure, transparent, and efficient framework for conducting financial transactions and services. 
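The immutability that hash-linked blocks provide can be illustrated with a toy hash chain. This is a deliberately simplified model, not the design of any production blockchain (there is no consensus, no signatures, no Merkle trees); it only demonstrates why editing an already-recorded transaction is detectable:

```python
import hashlib
import json

def block_hash(contents: dict) -> str:
    """Deterministic SHA-256 over a block's contents."""
    return hashlib.sha256(json.dumps(contents, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, transactions: list) -> None:
    """Link a new block to the chain via the previous block's hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    contents = {"prev_hash": prev, "transactions": transactions}
    chain.append({**contents, "hash": block_hash(contents)})

def verify(chain: list) -> bool:
    """Recompute every hash; editing any earlier block breaks the links after it."""
    prev = "0" * 64
    for block in chain:
        contents = {"prev_hash": block["prev_hash"],
                    "transactions": block["transactions"]}
        if block["prev_hash"] != prev or block["hash"] != block_hash(contents):
            return False
        prev = block["hash"]
    return True

chain: list = []
append_block(chain, [{"from": "alice", "to": "bob", "amount": 5}])
append_block(chain, [{"from": "bob", "to": "carol", "amount": 2}])
valid_before = verify(chain)                  # chain is intact
chain[0]["transactions"][0]["amount"] = 500   # tamper with recorded history
valid_after = verify(chain)                   # hashes no longer match
```

Because each block commits to the hash of its predecessor, rewriting one transaction invalidates every subsequent block, which is the mechanism behind the tamper-evidence described above.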
By leveraging these foundations, DeFi applications can offer a wide range of financial services that are accessible to anyone, anywhere, at any time, without the barriers and costs associated with the traditional financial system. In summary, the foundations of DeFi represent a paradigm shift in how financial services can be designed, deployed, and used. By understanding these underlying technologies, one can better appreciate the transformative potential of DeFi to create a more inclusive, efficient, and secure financial system. Stablecoins Stablecoins occupy a pivotal role in the ecosystem of DeFi, acting as a crucial bridge between the volatile cryptocurrency markets and the predictable value of traditional fiat currencies. They are designed to maintain a stable value over time, typically pegged to a specific fiat currency like the US dollar, or to other assets such as gold. This stability is essential for facilitating everyday transactions, lending, borrowing, and other financial activities within the DeFi space without the wild price fluctuations associated with typical cryptocurrencies. Definition and Mechanism Stablecoins are a type of cryptocurrency that aims to offer price stability by being pegged to a reserve asset. The most common approach is to tie the value of a stablecoin to a widely recognized fiat currency, such as the US dollar, at a 1:1 ratio. This peg is maintained through various mechanisms, including fiat-collateralized reserves, crypto-collateralized reserves, or algorithmic formulas that automatically adjust supply based on changes in demand. Examples of Stablecoins - $DAI: An Ethereum-based stablecoin that maintains its value close to one US dollar through overcollateralization with other cryptocurrencies. It is governed by the MakerDAO protocol. - Tether ($USDT): One of the first and most widely used stablecoins, Tether claims to be backed by US dollars held in reserve, offering a stable medium for transactions. 
- USD Coin ($USDC): A fully fiat-collateralized stablecoin, $USDC is managed by the Centre consortium, co-founded by Circle and Coinbase, providing a transparent and regulated entity that ensures every $USDC is backed by a dollar held in reserve. Importance in DeFi Stablecoins serve several critical functions in the DeFi ecosystem: - Medium of Exchange: They provide a stable and predictable medium of exchange for trading, lending, and other financial transactions, mitigating the risk of volatility associated with traditional cryptocurrencies. - Bridge to Traditional Finance: Stablecoins offer a direct connection between DeFi and the traditional financial system, allowing for the seamless transfer of value between the two worlds. - Liquidity and Accessibility: By offering a stable value, they encourage greater participation in the DeFi space, providing liquidity and making it more accessible to users unfamiliar with or wary of the volatility in the crypto market. - Enabler of Financial Services: Stablecoins enable a wide range of financial services within DeFi, including lending, borrowing, yield farming, and insurance, by providing a reliable unit of account and store of value. Use Cases and Benefits One of the key benefits of stablecoins is the facilitation of quick and inexpensive international transfers, bypassing traditional banking fees and exchange rates. They also enable users to lock in profits from trading activities without needing to convert assets back to fiat currency, maintaining their holdings within the crypto ecosystem. Furthermore, in regions with unstable currencies or restrictive financial systems, stablecoins offer a secure and stable means of preserving value. In summary, stablecoins are an essential component of the DeFi ecosystem, providing stability, security, and connectivity between the volatile world of cryptocurrencies and the more stable realm of traditional fiat currencies. 
Their role in enabling a broad array of financial services underscores the transformative potential of DeFi to offer accessible, efficient, and inclusive financial solutions. Borrowing and Lending Borrowing and lending represent fundamental pillars of the financial ecosystem, and in the realm of DeFi, these activities have been reimagined to leverage the transparency, security, and efficiency afforded by blockchain technology. DeFi platforms facilitate peer-to-peer lending and borrowing, eliminating traditional financial intermediaries and offering new opportunities for yield generation, liquidity, and financial management. Mechanism In DeFi, borrowing and lending are primarily facilitated through smart contracts on blockchain platforms. These contracts automate the terms and conditions of financial transactions, ensuring that they are executed exactly as agreed upon by the parties involved. This setup introduces a high degree of transparency and trust, as all transactions are recorded on the blockchain and are verifiable by anyone. Over-Collateralization A distinctive feature of DeFi lending and borrowing is the requirement for over-collateralization. Due to the volatility of cryptocurrencies and the anonymity of transactions, borrowers must usually deposit collateral exceeding the value of their loan. This mechanism protects lenders against default and market fluctuations, ensuring that the loan can be recovered even if the borrower fails to repay. Platforms and Protocols Several platforms and protocols have emerged as leaders in the DeFi borrowing and lending space, including: - Compound: A decentralized protocol that allows users to earn interest on deposits and borrow assets against collateral. - Aave: Another popular platform that offers innovative features like flash loans — instant, uncollateralized loans that must be repaid in the same transaction. 
- MakerDAO: Known for its stablecoin, DAI, MakerDAO enables users to generate DAI against their cryptocurrency collateral, effectively taking out a loan. Benefits - Accessibility: DeFi lending and borrowing platforms are accessible to anyone with an internet connection, regardless of geographic location or financial status. - Transparency: All transactions are recorded on the blockchain, providing a transparent record of lending terms, interest rates, and repayments. - Efficiency: Smart contracts automate the lending and borrowing processes, reducing the time and cost associated with these transactions. - Yield Opportunities: Lenders can earn passive income on their cryptocurrency holdings, often at rates higher than those offered by traditional banks. Flash Loans Flash loans are a unique and innovative DeFi feature allowing borrowers to access substantial liquidity without collateral, under the condition that the loan is repaid within the same transaction block. This mechanism enables sophisticated arbitrage, market making, and self-liquidation strategies that were previously impossible in traditional finance. Risks and Considerations While DeFi lending and borrowing offer significant advantages, they also come with risks, including smart contract vulnerabilities, market volatility, and the potential for liquidation if collateral values drop. Users must conduct thorough research and exercise caution when participating in these activities. Decentralized Exchanges (DEXes) Decentralized Exchanges (DEXes) are a cornerstone of the DeFi ecosystem, facilitating the permissionless trading of cryptocurrencies without the need for a centralized authority. Unlike traditional exchanges, which act as intermediaries between buyers and sellers, DEXes operate on blockchain technology, leveraging smart contracts to enable direct peer-to-peer transactions. Functionality and Operation DEXes utilize liquidity pools rather than traditional order books to facilitate trading. 
In a liquidity pool, users lock assets into a smart contract to provide market liquidity. Traders then buy and sell cryptocurrencies directly from these pools. Prices in a DEX are determined algorithmically, based on the ratio of assets in the liquidity pool, ensuring that the market remains liquid and trading can occur at any time without the need for a matching buyer or seller. Advantages of DEXes - Decentralization: By operating on a decentralized network, DEXes eliminate the risk of a single point of failure and reduce susceptibility to hacking, fraud, and regulatory interference. - Anonymity: Users can trade directly from their cryptocurrency wallets without needing to disclose their identity, offering greater privacy. - Accessibility: DEXes are accessible to anyone with an internet connection, removing barriers to entry for users worldwide. - Innovation: The open and permissionless nature of DEXes fosters innovation, allowing for the listing of a wide range of tokens, including those from emerging projects. Popular Decentralized Exchanges - Uniswap: One of the most popular DEXes on the Ethereum blockchain, Uniswap uses an automated market maker (AMM) model to provide liquidity and facilitate trading. - PancakeSwap: A leading DEX on the Binance Smart Chain, offering similar functionalities to Uniswap but with lower transaction fees. - SushiSwap: Initially a fork of Uniswap, SushiSwap has introduced additional features and incentives for liquidity providers and traders. Challenges and Considerations While DEXes offer numerous benefits over their centralized counterparts, there are also challenges to consider: - User Experience: The interface and trading experience on DEXes can be less intuitive for users accustomed to traditional exchanges, potentially deterring mainstream adoption. - Slippage: In markets with low liquidity, large orders can significantly impact prices, leading to slippage and less favorable trade executions. 
- Smart Contract Risks: As DEXes operate on smart contracts, any vulnerabilities in the contract code can be exploited, posing a risk to users' funds. The Role of DEXes in DeFi DEXes play a crucial role in the DeFi ecosystem by enabling decentralized trading, fostering financial inclusion, and promoting a new era of financial sovereignty. They serve as a testament to the power of blockchain technology in creating more open, transparent, and equitable financial systems. As the DeFi space continues to evolve, DEXes are likely to see further innovation and adoption, challenging traditional financial markets and paving the way for a decentralized financial future. Insurance In the traditional financial world, insurance is a key component of risk management, providing a safety net against unexpected losses. DeFi brings this concept into the blockchain, offering decentralized insurance solutions that leverage smart contracts to automate claims and payouts. This innovation aims to increase transparency, reduce costs, and expand accessibility to insurance products. Mechanism and Benefits DeFi insurance utilizes blockchain technology to manage and execute insurance policies without the need for centralized insurance companies. Smart contracts automatically enforce policy conditions, execute claims, and manage payouts based on predefined rules and triggers. This automation reduces administrative costs and the potential for human error or bias, making the claims process faster and more efficient. The transparency inherent in blockchain technology also plays a crucial role. Policy terms, conditions, and transactions are recorded on a public ledger, providing clear, indisputable records of agreements and transactions. This transparency builds trust among participants and can significantly streamline dispute resolution. Key Features of DeFi Insurance - Customizable Coverage: Users can tailor coverage to their specific needs, choosing from a variety of risks and parameters. 
- Immediate Payouts: Smart contracts can trigger automatic payouts upon the occurrence of a predefined event, eliminating the waiting period associated with traditional insurance claims. - Access to Underserved Markets: DeFi insurance can provide coverage for risks that are underserved or ignored by traditional insurers, including smart contract failures, exchange hacks, and crypto wallet thefts. Examples of DeFi Insurance Platforms - Nexus Mutual: A decentralized insurance protocol that allows members to pool and share risk. It offers coverage against smart contract failures, which is a significant concern in the blockchain space. - Etherisc: A platform that aims to decentralize insurance applications, allowing developers to build their own insurance products on the blockchain. - Opyn: Provides options-based insurance, allowing DeFi users to hedge against $ETH price drops and other DeFi risks. Challenges and Future Prospects While DeFi insurance holds promising potential, it also faces several challenges, including regulatory uncertainty, the complexity of assessing and pricing blockchain-based risks, and the need for more widespread understanding and adoption. Additionally, the nascent state of DeFi insurance means the market is still developing, with limited data on risk assessment and claims history. Despite these challenges, DeFi insurance represents a significant step forward in democratizing access to financial services. By leveraging blockchain technology, DeFi insurance can offer more transparent, efficient, and inclusive alternatives to traditional insurance, potentially transforming how we manage risk. As the DeFi ecosystem continues to mature, it is expected that DeFi insurance will evolve to address a broader range of risks, attract more users, and integrate more deeply with other DeFi services. 
This evolution will likely include more sophisticated risk assessment models, enhanced regulatory clarity, and innovative insurance products that further leverage the benefits of DeFi. Margin Trading Margin trading is a critical financial tool that allows traders to leverage their positions, amplifying both potential profits and risks. Within the context of DeFi, margin trading has been redefined to operate on blockchain technology, offering a decentralized approach to leverage and short selling without the need for traditional brokerage houses. Overview In DeFi, margin trading involves borrowing funds to increase the size of a trading position beyond what would be possible with one's own capital alone. This mechanism allows traders to amplify their exposure to market movements, potentially increasing their profits from successful trades. However, it also increases the potential for significant losses, making it a high-risk strategy. How DeFi Margin Trading Works DeFi platforms facilitate margin trading through smart contracts, which automate the borrowing, trading, and settlement processes. Traders can leverage their positions by using cryptocurrencies as collateral to borrow additional funds. This process is typically done in a decentralized manner, with the terms of the leverage, interest rates, and liquidation thresholds all encoded within smart contracts. Key Components of DeFi Margin Trading - Leverage: The ratio of borrowed funds to the trader's own investment. Higher leverage increases potential returns but also risks. - Collateral: The assets deposited by the trader to secure the borrowed funds. In DeFi, collateral is often in the form of cryptocurrency. - Liquidation: If the value of the collateral falls below a certain threshold due to adverse market movements, the position may be automatically liquidated to repay the borrowed funds. 
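The interaction of leverage, collateral, and liquidation can be made concrete with some illustrative arithmetic. The formula and numbers below are hypothetical (every platform defines its own margin and liquidation parameters); the sketch assumes a long position with one unit of posted margin and a fixed maintenance-margin ratio:

```python
def liquidation_price(entry_price: float, leverage: float,
                      maintenance_ratio: float = 0.05) -> float:
    """Price at which a hypothetical long position hits its maintenance margin.

    The trader posts 1 unit of margin, opens a position worth `leverage` units,
    and therefore borrows `leverage - 1`. Liquidation triggers when
    equity / position_value <= maintenance_ratio.
    """
    size = leverage / entry_price  # units of the asset held
    debt = leverage - 1.0          # borrowed funds to repay
    # Solve size * p - debt = maintenance_ratio * size * p for the price p:
    return debt / (size * (1.0 - maintenance_ratio))

# A long opened at $2,000 with 5x leverage is liquidated near $1,684,
# i.e. after only a ~16% drop; at 10x the threshold is closer still.
p_5x = liquidation_price(2000.0, 5.0)    # ≈ 1684.2
p_10x = liquidation_price(2000.0, 10.0)  # ≈ 1894.7
```

The pattern is general: the higher the leverage, the closer the liquidation price sits to the entry price, which is why leveraged positions in volatile crypto markets are liquidated so frequently.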
Advantages of DeFi Margin Trading - Accessibility: DeFi platforms offer global access to margin trading without the need for traditional brokerage accounts, making it easier for individuals to participate. - Transparency: The use of blockchain technology ensures that all transactions and positions are recorded transparently, providing clear visibility into market operations. - Autonomy: Traders retain control over their funds until the point of trade execution, reducing the risk of third-party mismanagement or interference. Risks and Considerations Margin trading in DeFi, while offering significant opportunities, comes with substantial risks, particularly due to the volatility of cryptocurrency markets. The possibility of rapid price movements can lead to swift liquidation of leveraged positions, resulting in significant losses. Moreover, the complexity of smart contracts and the potential for bugs or vulnerabilities add another layer of risk. Future Outlook As the DeFi ecosystem continues to evolve, margin trading platforms are becoming more sophisticated, offering advanced features such as isolated margin accounts, cross-margin trading, and more diverse lending pools. These developments could attract a wider range of traders and investors, further integrating DeFi margin trading into the broader financial landscape. Advantages of DeFi DeFi offers a suite of advantages over traditional financial systems. These benefits stem from DeFi's foundational use of blockchain technology, which enables more open, transparent, and accessible financial services. Here are some of the key advantages of DeFi: Accessibility and Inclusivity - Global Access: DeFi services are available to anyone with an internet connection, removing geographical barriers to financial services. This global accessibility ensures that individuals in underserved or unbanked regions can participate in the financial ecosystem. 
- 24/7 Availability: Unlike traditional financial institutions that operate during business hours, DeFi platforms are available around the clock, providing continuous access to financial services. Transparency and Security - Transparent Operations: All transactions and smart contract rules in DeFi are recorded on the blockchain, providing a transparent ledger that anyone can inspect. This openness fosters trust among users and makes the system more resistant to fraud. - Enhanced Security: By distributing data across a blockchain network, DeFi reduces the risk of hacking and fraud associated with centralized databases. The cryptographic security of blockchain further ensures the safety of assets and transactions. Censorship Resistance and Control - Censorship Resistance: DeFi operates without central control, making it difficult for any single entity to freeze accounts or block transactions. This ensures that users retain complete control over their assets. - User Sovereignty: DeFi gives users full control over their financial activities, from trading and lending to borrowing and saving. This autonomy contrasts sharply with the traditional financial system, where institutions often have the final say. Reduced Costs and Efficiency - Lower Fees: By eliminating intermediaries, DeFi significantly reduces the fees associated with financial transactions. This cost efficiency is particularly beneficial for small transactions that would be otherwise uneconomical in the traditional banking system. - Efficient Markets: DeFi platforms facilitate faster transactions and settlements than traditional financial systems, thanks to the automation provided by smart contracts. This efficiency can lead to more dynamic and responsive financial markets. Innovation and Financial Democratization - Rapid Innovation: The open-source nature of DeFi platforms encourages continuous innovation, with developers freely building on existing protocols to create new financial products and services. 
- Financial Democratization: DeFi levels the playing field, allowing retail investors to access financial instruments and opportunities that were previously available only to institutions or high-net-worth individuals. Disadvantages of DeFi While DeFi offers numerous advantages that challenge the traditional financial system, it also comes with its set of disadvantages and challenges. Understanding these drawbacks is crucial for users and developers alike to navigate the DeFi space responsibly and to work towards mitigating these issues. Technical Complexity and Usability - Steep Learning Curve: DeFi platforms often have complex interfaces and require a good understanding of blockchain technology, which can be daunting for newcomers. - Usability: The user experience (UX) in many DeFi platforms can be less intuitive compared to traditional financial services, potentially hindering wider adoption. Security Risks - Smart Contract Vulnerabilities: Despite the security benefits of blockchain, DeFi is still prone to risks associated with bugs or exploits in smart contract code, leading to significant losses. - Lack of Regulation: The largely unregulated nature of DeFi can expose users to scams, frauds, and other malicious activities without the safeguard of regulatory protection. Market Volatility and Liquidity Issues - High Volatility: The DeFi market is known for its high volatility, which can lead to significant price swings and liquidity issues, affecting the stability and predictability of investments. - Liquidity Concerns: Some DeFi platforms and tokens may suffer from low liquidity, making it difficult to execute large transactions without impacting the market price. Scalability and Performance - Network Congestion: Popular blockchain networks that host DeFi services, like Ethereum, can become congested, leading to slow transaction times and high fees, especially during peak usage. 
- Scalability Challenges: Current blockchain technology may struggle to scale efficiently to meet the demands of global finance, potentially limiting DeFi's growth and adoption. Regulatory and Legal Uncertainty - Regulatory Ambiguity: The rapidly evolving nature of DeFi and its innovative financial products often outpace current regulatory frameworks, leading to uncertainty and potential future crackdowns. - Compliance Risks: Users and service providers may inadvertently violate financial regulations due to the cross-jurisdictional nature of blockchain, posing legal risks. Interoperability and Integration - Limited Interoperability: While strides are being made, DeFi protocols and platforms still face challenges in interoperability, making it difficult for various services to work seamlessly together. - Integration with Traditional Finance: The integration of DeFi with traditional financial systems remains complex, limiting the potential for broader financial inclusion and efficiency gains. Future Outlook of DeFi The future of DeFi is at the intersection of innovation, regulatory evolution, and increasing mainstream adoption. As DeFi continues to mature, it faces both opportunities for exponential growth and challenges that must be navigated carefully. Here's an exploration of what the future may hold for DeFi: Continued Growth and Innovation - Innovative Financial Products: The DeFi ecosystem is likely to continue its trajectory of rapid innovation, introducing more sophisticated financial instruments that mimic and improve upon traditional finance, including more complex derivatives and insurance products. - Cross-chain and Layer 2 Solutions: To address scalability and interoperability issues, DeFi is expected to increasingly adopt cross-chain technologies and Layer 2 solutions, enabling faster transactions, reduced fees, and seamless interaction between different blockchains. 
Integration with Traditional Finance - Institutional Adoption: As DeFi proves its value proposition, more institutional investors are expected to enter the space, bringing in significant capital and potentially stabilizing the market. - Hybrid Models: There may be a rise in hybrid finance (HyFi) models that blend the best aspects of traditional finance (TradFi) and DeFi, offering users the advantages of both worlds. Regulatory Clarity and Compliance - Regulatory Frameworks: The development of clear regulatory frameworks tailored to DeFi's unique characteristics is crucial. Regulatory clarity will likely lead to greater adoption by reducing the risks associated with legal uncertainties. - Self-Regulation and Compliance Tools: The DeFi community may develop more robust self-regulatory practices and compliance tools, making it easier for DeFi projects to adhere to global financial regulations. Enhanced Security Measures - Improved Security Protocols: Ongoing efforts to enhance the security of smart contracts and DeFi platforms will be critical in minimizing risks associated with hacks and exploits. - Insurance Mechanisms: The expansion of DeFi insurance products will provide users with protection against potential losses, increasing trust in the ecosystem. User Experience and Accessibility - Simplified User Interfaces: Efforts to improve the user experience (UX) of DeFi platforms will be key in making DeFi more accessible to a broader audience, including those less familiar with blockchain technology. - Education and Awareness: Increased educational initiatives will play a vital role in bridging the knowledge gap, enabling more users to safely navigate the DeFi space. Global Financial Inclusion - Expanding Access: By continuing to lower barriers to entry, DeFi has the potential to further enhance global financial inclusion, providing underserved populations with access to financial services previously out of reach. 
Radix: The Full Stack for DeFi

Radix takes a groundbreaking approach to DeFi, designed for a future where DeFi is part of everyone's daily life. Through a decade of diligent research, testing, and development, Radix has introduced a suite of custom technologies that seamlessly integrates a wallet, a unique programming language, an efficient execution environment, and a novel consensus algorithm. This comprehensive integration, known as the "Radix Full Stack," sets Radix apart as a radically innovative platform in the DeFi space.

The Radix Wallet: Enhancing User Experience
The Radix Wallet (/contents/tech/core-protocols/radix-wallet) introduces a paradigm shift in how users interact with DeFi transactions. It aims to address the common issue of complex and confusing transaction approvals in the Web3 ecosystem by providing a user-friendly, understandable preview of transactions. This feature, powered by the Radix Transaction Manifest, ensures that users are fully informed about the actions they approve, significantly reducing the risk of unintended asset losses. Furthermore, Radix revolutionizes account control through Smart Accounts (/contents/tech/core-protocols/smart-accounts), eliminating the need for traditional seed phrases and offering a secure, non-custodial crypto experience that is as intuitive as using any standard app. The wallet's mobile-first design facilitates seamless interaction with smart contracts and dApps across devices, enhancing accessibility and security for the everyday user.

Scrypto: Simplifying DeFi Development
At the core of Radix's innovation is Scrypto (/contents/tech/core-protocols/scrypto-programming-language), an asset-oriented programming language designed to democratize the creation of digital assets and DeFi applications.
Scrypto and the Radix Engine (/contents/tech/core-protocols/radix-engine) together place assets, accounts, and permissions at the forefront, making asset security intuitive and straightforward for developers. This approach has already attracted over 12,000 developers worldwide, leading to the creation of diverse DeFi solutions, including decentralized exchanges, oracles, and DAOs. With the Babylon (/contents/tech/releases/radix-mainnet-babylon) upgrade, Radix has enabled smart contracts to be deployed on the Radix Mainnet, further expanding the possibilities for developers to innovate within the Web3 space. Radix Engine: Streamlining DeFi Development The Radix Engine (/contents/tech/core-protocols/radix-engine) introduces a concept similar to that of game engines in the video game industry, by handling the standardizable, low-level aspects of DeFi applications. This "DeFi Engine" allows developers to focus more on the unique features and functionalities that enhance user experiences rather than getting bogged down by underlying technical details. By making DeFi development more accessible and efficient, the Radix Engine encourages the creation of secure, intuitive, and powerful dApps, ensuring a safer ecosystem for users. Cerberus: A New Standard in Consensus Algorithms Completing the Radix Full Stack is Cerberus (/contents/tech/core-protocols/cerberus-consensus-protocol) , a consensus algorithm designed to address the scalability and efficiency challenges faced by existing blockchain networks. Born from extensive research, Cerberus uniquely enables parallel transaction execution while maintaining atomic composability (/contents/tech/core-concepts/atomic-composability) , promising to scale linearly with demand. This innovative approach ensures that the Radix Network can support a high volume of transactions without compromising on speed, cost, or user experience. 
## Radix Visuals
URL: https://radix.wiki/contents/resources/radix-visuals
Updated: 2026-01-27

Radix Protocols
- Badges
- Cerberus / Cassandra
- Developer Resources
- Native Assets
- Radix Engine

Wallet Features
- Delegated Fees
- Personas
- Radix Connect
- Smart Accounts
- Transaction Manifest
- Transaction Preview

Initiatives / Events
- HyperScale
- RadQuest

## Radix Foundry Program
URL: https://radix.wiki/contents/resources/radix-foundry-program
Updated: 2026-01-27

The Radix Foundry Program is an incubator initiative launched in 2024 by Radix Publishing as part of the Radix ecosystem's $10M development fund (https://www.radixdlt.com/blog/250m-xrd-10m-to-power-the-next-web3-breakthrough---the-new-radix-ecosystem-fund). The program aims to incubate decentralized applications (dApps) in high-potential categories to drive significant growth in users, Total Value Locked (TVL), and on-chain activity within the Radix ecosystem.

Overview
Projects participating in the Radix Foundry Program receive up to $250,000 in $XRD tokens along with comprehensive support (https://www.radixdlt.com/blog/introducing-radix-foundry-program) including marketing assistance, business development guidance, and technical mentorship from the RDX Works leadership team. Additional benefits include introductions to venture capital firms, potential partners, exchanges, and market makers, as well as coordinated go-to-market strategies with Radix Publishing. The program was established following early ecosystem successes in decentralized finance (DeFi), with several Radix-based applications achieving notable milestones. By early 2024, decentralized exchanges like CaviarNine, Ociswap, and DeFi Plaza, along with money markets such as WEFT, had accumulated more than $30 million in Total Value Locked (https://www.radixdlt.com/blog/introducing-radix-foundry-program).
Other successful projects included ShardSpace (/ecosystem/shardspace) , Trove (/ecosystem/trove) , and $XRD Domains (/ecosystem/xrd-domains) , which demonstrated sophisticated user experiences in the Web3 space. As of late 2024, the program operates as a highly selective initiative (https://www.radixdlt.com/blog/ecosystem-fund-update-new-directions-for-growth-and-support) , with the ecosystem fund expecting to make up to one investment per quarter in standout projects through the Foundry Program. This selective approach aims to identify and support projects that can significantly impact the growth and adoption of the Radix network while maintaining high quality standards for decentralized applications. The Foundry Program represents a strategic component of Radix's broader ecosystem development strategy, working in conjunction with other funding initiatives like the Booster Grants program to create a comprehensive support system for blockchain developers and entrepreneurs building on the Radix platform. Program Structure The Radix Foundry Program operates through a structured application and support system designed to identify and nurture promising blockchain projects. The application process begins with teams typically applying through the Radix Booster Grant program (https://www.radixdlt.com/blog/introducing-radix-foundry-program) , which serves as an initial entry point for projects seeking support within the Radix ecosystem. - Support Framework: Projects selected for the Foundry Program receive comprehensive support across multiple areas. The program provides market research and insights on business opportunities, mentorship from the RDX Works leadership team, and coordinated go-to-market strategy support with Radix Publishing (https://www.radixdlt.com/blog/introducing-radix-foundry-program) . 
Additionally, teams receive exclusive co-marketing opportunities and strategic introductions to potential partners, including venture capital firms, exchanges, and market makers. - Integration with Ecosystem Grants: The Foundry Program operates as part of a broader ecosystem support structure. While teams can receive up to $160,000 through the regular Booster Grants program, the Foundry Program provides additional funding of up to $250,000 in $XRD tokens (https://www.radixdlt.com/blog/radix-booster-grants-apply-now) . This integration ensures that projects can access different levels of support based on their development stage and needs. - Selection and Oversight: The program maintains high selectivity, with the ecosystem fund typically making one investment per quarter through the Foundry Program (https://www.radixdlt.com/blog/ecosystem-fund-update-new-directions-for-growth-and-support) . Projects are evaluated based on their potential impact on the Radix ecosystem, with particular attention paid to their ability to drive significant user growth, increase Total Value Locked (TVL), and enhance on-chain activity. - Success Metrics: Projects within the program are evaluated based on their achievement of specific milestones and their contribution to ecosystem growth (https://www.radixdlt.com/blog/250m-xrd-10m-to-power-the-next-web3-breakthrough---the-new-radix-ecosystem-fund) . The program tracks various metrics including user adoption, transaction volume, and liquidity metrics to assess the success of supported projects. Teams are expected to present strong plans for reaching their growth goals in terms of both product development and business strategy. - Technical Integration: Successful projects in the Foundry Program are expected to leverage the full capabilities of the Radix technology stack, particularly focusing on creating mainstream-ready user experiences. 
This includes utilizing Radix's native asset system, integrating with the Radix Wallet, and implementing other platform-specific features to ensure optimal performance and user experience within the Radix ecosystem. Focus Areas The Radix Foundry Program has identified seven key high-potential categories for development within the Radix ecosystem. These categories were selected based on their potential to drive significant user growth and increase total value locked (TVL) in the network. - Lending and Borrowing Markets: The program prioritizes the development of money markets, recognizing their vital role in DeFi ecosystem success (https://www.radixdlt.com/blog/introducing-radix-foundry-program) . These platforms enable lenders to earn interest on deposited capital while allowing borrowers to gain leverage, leading to improved capital allocation and market efficiency. The program cites successful examples like Aave ($10B TVL) and Compound ($2.2B TVL) from the Ethereum ecosystem as models for potential development on Radix. - Real World Assets (RWA): Real World Asset tokenization represents a significant opportunity in the crypto space (https://www.radixdlt.com/blog/introducing-radix-foundry-program) . The program supports projects that facilitate the tokenization of off-chain assets, enabling issuers to access global, permissionless, 24/7 capital pools. This category aims to leverage Radix's native asset system to create more predictable and transparent asset behaviors compared to other blockchain platforms. - Yield Derivatives: The program recognizes yield derivatives as crucial financial products that allow users to control future cash flows, speculate on interest rate changes, and manage portfolio risks. With approximately $250 million in Liquid Stake Units (LSUs) within the Radix ecosystem (https://www.radixdlt.com/blog/introducing-radix-foundry-program) , yield derivative applications have significant potential for immediate market impact and future growth. 
- NFT Pro Trading: The program supports the development of sophisticated NFT trading infrastructure (https://www.radixdlt.com/blog/introducing-radix-foundry-program) , going beyond basic marketplace functionality to include professional tools for NFT analysis and trading. This focus area aims to create platforms that enable complex trading strategies, collection analysis, and professional-grade trading features similar to successful platforms like Blur and Tensor. - Launchpad Services: Token launch infrastructure is identified as a key component for ecosystem growth (https://www.radixdlt.com/blog/introducing-radix-foundry-program) . The program supports the development of launchpad services that can help projects navigate legal and regulatory requirements while providing access to liquid capital and marketing support. Successful examples cited include Seedify, DAOMaker, and Decubate. - Stablecoins and Collateralized Debt Positions: The program recognizes the fundamental importance of stablecoins in crypto adoption and supports projects developing both stablecoin solutions and collateralized debt positions (CDPs). This category aims to replicate the success of projects like MakerDAO with Dai (https://www.radixdlt.com/blog/introducing-radix-foundry-program) , which played a crucial role in catalyzing DeFi adoption on Ethereum. - Liquid Staking and Restaking: Building on the approximately $250 million in Liquid Stake Units within the Radix ecosystem (https://www.radixdlt.com/blog/introducing-radix-foundry-program) , the program supports innovation in liquid staking and restaking services. This focus area aims to expand the utility of staked assets and create new use cases for liquid staking derivatives, following the success of projects like Eigenlayer in the broader crypto ecosystem. 
Notable Projects The Radix Foundry Program announced its first two participant projects in early 2024, selecting initiatives in the perpetual decentralized exchange and NFT trading categories. Surge Surge was selected as the first recipient of a Radix Foundry grant (https://www.radixdlt.com/blog/radix-foundrys-latest-projects---surge-and-trove) to develop a perpetual decentralized exchange (DEX) on the Radix network. The project is being developed by a team with expertise in mathematics, derivatives trading, and quantitative finance. Surge aims to build upon lessons learned from established perpetual platforms such as Drift, GMX, DyDx, and Jupiter. The platform's development addresses a significant market opportunity, as decentralized perpetual markets represent only 2.3% of the total derivatives volume in crypto, compared to 22% for spot trading (https://www.radixdlt.com/blog/radix-foundrys-latest-projects---surge-and-trove) . By February 2024, Surge had already demonstrated significant traction, with over 100 million CAVIAR tokens locked for their initial token allocation (https://www.radixdlt.com/blog/monthly-wins-joining-the-idos-consortium-14m-liquidity-initiative-and-more-ecosystem-success) . Trove Trove was selected as the second Foundry recipient (https://www.radixdlt.com/blog/radix-foundrys-latest-projects---surge-and-trove) to develop professional NFT trading infrastructure. The platform initially launched in late 2023 with basic NFT trading functionality and quickly achieved over 1 million $XRD in traded volume. The project demonstrated early success by facilitating trading for notable NFT collections including Abandoned Scorpions, Radical Robos, $XRD Domains, and ICE404. Under the Foundry Program, Trove is developing an enhanced platform focused on professional traders, featuring real-time data aggregation, visualized analytics, and improved collection liquidity. 
The upgraded platform plans to include floor price charting tools, complex trade execution, customizable trading interfaces, collection bidding capabilities, and competitive fee structures. The project also aims to support emerging NFT standards within the ecosystem, such as 404-style tokens, dynamic traits, and nestable tokens. Selection Criteria The Radix Foundry Program employs a comprehensive evaluation framework to select participating projects, focusing on both technical capabilities and business potential. - Technical Requirements: Projects must demonstrate the ability to leverage the full capabilities of the Radix technology stack (https://www.radixdlt.com/blog/introducing-radix-foundry-program) to create mainstream-ready user experiences. Technical evaluation includes assessment of the project's ability to utilize Radix's native asset system and integrate effectively with the platform's infrastructure. - Team Assessment: The program seeks founders and entrepreneurs who possess specific qualities (https://www.radixdlt.com/blog/introducing-radix-foundry-program) , including: - Demonstrated business and technical acumen necessary to transform ideas into successful products. - Proven track record in achieving user growth, liquidity, and volume metrics either within the Radix ecosystem or other blockchain platforms. - Ability to identify and capitalize on untapped market opportunities. - Business Model Evaluation: Projects are evaluated based on their potential for sustainable growth and market impact (https://www.radixdlt.com/blog/ecosystem-fund-update-new-directions-for-growth-and-support) . Applications must demonstrate: - Clear revenue generation path or existing revenue streams. - Potential to enhance DeFi activity on the Radix network. - Path to project sustainability beyond grant funding. 
- Market Opportunity: Projects must present evidence of market demand and growth potential (https://www.radixdlt.com/blog/ecosystem-fund-update-new-directions-for-growth-and-support) . For applications in categories where similar projects exist, proposals must clearly articulate unique selling propositions and specific market opportunities. The program emphasizes the importance of presenting proposals as investment opportunities, focusing on potential value creation and traction in the broader cryptocurrency space. - Development Milestones: Projects seeking support must outline clear development milestones and timelines. The application process includes defining specific project scope and deadlines (https://www.radixdlt.com/blog/radix-booster-grants-apply-now) , which are established collaboratively with each team as part of the grant agreement. Progress against these milestones is monitored and used as a basis for continued support and funding distribution. - Compliance Requirements: All projects must meet basic compliance standards, including successful completion of Know Your Customer (KYC) or Know Your Business (KYB) verification processes (https://www.radixdlt.com/blog/radix-booster-grants-apply-now) before receiving funding. This requirement ensures regulatory compliance and program integrity while maintaining high standards for participating projects. ## Radix Endowment Fund URL: https://radix.wiki/contents/resources/radix-endowment-fund Updated: 2026-01-27 The Radix Endowment Fund is a 1.5 billion $XRD fund established in 2024 (https://www.radixdlt.com/blog/the-radix-endowment-fund) to support long-term ecosystem growth and provide financial stability for entities involved in the development of the Radix platform. The fund is managed by Brevan Howard Digital (https://www.radixdlt.com/blog/radix-endowment-manager-selected) , the digital asset investment platform of Brevan Howard, a global hedge fund managing approximately $34 billion in assets. 
Overview The fund was created to establish a sustainable source of token generation for ongoing development of the Radix ecosystem while reducing reliance on treasury token sales. A key objective of the endowment is to help drive external investment and co-investment into the Radix ecosystem (https://www.radixdlt.com/blog/the-radix-endowment-fund) . The fund operates through a segregated portfolio company structure, which ring-fences the $XRD assets from the manager's own assets to provide security. The selection process for the fund manager prioritized institutions that could enhance Radix's institutional adoption through product development, partnership building, and technology alignment. The chosen manager, Brevan Howard Digital, manages over $1 billion in digital assets (https://www.radixdlt.com/blog/the-radix-endowment-fund) and has significant trading volume on centralized exchanges, positioning it to support exchange adoption of Radix through an institutional lens. Structure and Management The Radix Endowment Fund operates through a segregated portfolio company structure (https://www.radixdlt.com/blog/the-radix-endowment-fund) , which ensures the $XRD assets are ring-fenced from the manager's own assets while maintaining ownership rights for the contributing entities. This structure provides bankruptcy protection for the fund's assets. Brevan Howard Digital (https://www.radixdlt.com/blog/radix-endowment-manager-selected) , through its BH Digital Solutions business, manages the fund's $XRD-denominated contributions. The assets are held in designated accounts at Copper Custody (https://www.radixdlt.com/blog/copper-integrates-radix-into-institutional-custody-services) , a regulated crypto custodian that provides institutional custody services. Copper hosts over $50 billion in monthly notional trading volume and supports custody for more than 600 digital assets. 
The fund's staking infrastructure is provided by Twinstake (https://www.radixdlt.com/blog/the-radix-endowment-fund) , which maintains a minimum staking balance requirement as part of the overall strategy. This arrangement allows the fund manager to unstake portions of the position and redeploy $XRD into other strategies as token prices increase or as additional staking occurs through Twinstake. Fund oversight includes multiple boards responsible for fiduciary management and control (https://www.radixdlt.com/blog/radix-endowment-manager-selected) of the relevant Radix assets. The boards prioritize core strategies that are implemented by Brevan Howard Digital as the investment manager. Investment Strategy The Radix Endowment Fund operates with $XRD-based mandates (https://www.radixdlt.com/blog/the-radix-endowment-fund) , requiring the fund manager to ensure returns in $XRD rather than other assets. This strategy aims to grow the endowment's $XRD holdings over time for the benefit of contributing entities. A significant portion of the fund's strategy involves staking through Twinstake nodes. The fund maintains a minimum staking balance requirement (https://www.radixdlt.com/blog/the-radix-endowment-fund) , with the ability to unstake and redeploy $XRD into other alpha-generating strategies as market conditions permit. The initial staking position increased net network staking by approximately 620 million $XRD. The investment strategy is developed and overseen by a leading panel of portfolio managers (https://www.radixdlt.com/blog/the-radix-endowment-fund) tasked with generating returns exceeding standard staking rewards. The strategy integrates with various institutional partners, including custody infrastructure providers, market makers, ETP issuers, tier 1 exchanges, and stablecoin issuers to facilitate commercial partnerships within the broader crypto ecosystem. 
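The scale of the initial staking position (~620 million $XRD) can be put in rough quantitative context. The sketch below is a back-of-the-envelope approximation, not an official calculation: it assumes the commonly cited figure of a fixed Radix network emission of roughly 300 million XRD per year shared among stakers, so that staking yield ≈ annual emission / total staked, and it takes the ~7.7% p.a. pre-fund emissions rate reported by Radix as its starting point.

```python
# Back-of-the-envelope: effect of ~620M extra staked XRD on staking yield.
# Assumption (not official): a fixed network emission of ~300M XRD/year is
# shared among stakers, so yield ~= annual_emission / total_staked.
annual_emission = 300e6   # XRD per year (assumed fixed emission)
apy_before = 0.077        # ~7.7% p.a. staking emissions before the fund staked
added_stake = 620e6       # net stake increase attributed to the fund

staked_before = annual_emission / apy_before          # implied total stake
apy_after = annual_emission / (staked_before + added_stake)

print(f"Implied stake before: {staked_before / 1e9:.2f}B XRD")
print(f"Implied yield after:  {apy_after:.1%}")
```

Under these assumptions the extra stake dilutes the per-token yield to roughly 6.6% p.a., in the same ballpark as the ~6.7% figure reported after the fund's staking activities.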
The fund's approach balances generating returns with supporting ecosystem development, as evidenced by its strategic role in driving institutional adoption (https://www.radixdlt.com/blog/babylon-turns-one-key-achievements-and-the-road-ahead) and market infrastructure improvements. This includes supporting products like Flash Liquidity and partnerships with entities such as Copper Custody and Keyrock. Impact on Radix Network The establishment of the Radix Endowment Fund has had several significant effects on the Radix network's operations and tokenomics. The fund's staking activities reduced network staking emissions from approximately 7.7% to 6.7% per annum (https://www.radixdlt.com/blog/the-radix-endowment-fund) . To accommodate this change and maintain network decentralization, the Radix Foundation reduced its nodes by two, decreasing its staked tokens by approximately 122 million $XRD and creating space in the top 100 validator set. The fund's operations resulted in a net increase of approximately 620 million $XRD in network staking (https://www.radixdlt.com/blog/the-radix-endowment-fund) . This substantial increase in staked tokens has strengthened the network's security while providing a foundation for institutional participation in network validation. Network decentralization considerations (https://www.radixdlt.com/blog/the-radix-endowment-fund) were specifically addressed in the fund's design, with the reduction in Foundation nodes helping to offset the concentration of staked tokens. The fund's structure also supports the long-term development of the Radix ecosystem by providing a sustainable source of funding that reduces reliance on treasury token sales. The fund has contributed to broader ecosystem development through its role in facilitating institutional adoption and strategic partnerships (https://www.radixdlt.com/blog/babylon-turns-one-key-achievements-and-the-road-ahead) . 
This includes supporting initiatives like Flash Liquidity and fostering relationships with key market infrastructure providers, which collectively aim to increase the network's total value locked (TVL) and overall market participation.

## Proof of Work
URL: https://radix.wiki/contents/resources/python-scripts/proof-of-work
Updated: 2026-01-27

A basic implementation of the Proof of Work (PoW) algorithm used for Sybil prevention by Bitcoin and other PoW networks. A deeper comparison between PoW and Proof of Stake (PoS) can be found in our article PoW vs PoS: The Next Industrial Revolution (/blog/pow-vs-pos-the-next-industrial-revolution).

Method & Python Script
- Install VS Code: https://code.visualstudio.com (https://code.visualstudio.com) or a similar application.
- In VS Code open a new terminal window by navigating to Terminal > New Terminal.
- Install Homebrew (https://brew.sh/) by pasting the following code into the terminal and pressing Enter: /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
- Install Python in the same way: brew install python
- No further dependencies need to be installed: the script only uses hashlib and datetime, which ship with Python's standard library.
- Now, create a project folder and navigate to it in VS Code via File > Open Folder.
- Create a new Python file in VS Code via File > New File. Name it something like pow.py
- This script searches for a nonce such that the SHA-256 hash of the text “hello” combined with the nonce begins with five leading (hex) zeros; each additional required zero multiplies the expected number of attempts by 16. Copy and paste it into pow.py and save it.
```python
# v.0.1
from hashlib import sha256
from datetime import datetime

def pow(data, zeros, nonce):
    data = str(data).encode("utf-8")
    zeros = zeros * "0"  # e.g. 5 -> "00000"
    t1 = datetime.now()
    while True:
        combo = data + f"{nonce}".encode("utf-8")
        hash_ = sha256(combo).hexdigest()
        if hash_.startswith(zeros):
            t2 = datetime.now()
            return nonce, hash_, t2 - t1
        else:
            nonce += 1

output = pow("hello", 5, 0)  # 5 leading zeros = ~1 second
print(output)
```

- Run the script from the terminal with: python3 pow.py

## DLT Efficiency Metric
URL: https://radix.wiki/contents/resources/python-scripts/dlt-efficiency-metric
Updated: 2026-01-27

This script treats DLTs such as Radix and Ethereum like engines and attempts to measure their efficiency in creating utility.

Introduction
Mechanical efficiency is measured by dividing work (or power output) by power input: η = W / I. The following method substitutes Github repos for work and Google Trends volume as a measure of power input. The assumption is that the amount of ecosystem activity will be roughly reflected in the search volume.
In this version, the number of Github repos for Radix and Ethereum is determined using the comparable search terms “Scrypto Radix” and “Solidity Ethereum”. The Google Trends volumes for “Scrypto Radix” are not large enough to register, so instead we have used “XRD Radix” and “ETH Ethereum” to maintain equivalence and eliminate searches for other uses of the term ‘Radix’.
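To make the metric concrete, here is a toy calculation of η. The numbers and the helper function are illustrative placeholders for exposition only, not live GitHub or Google Trends data:

```python
# Toy illustration of the efficiency metric: eta = repos / search_volume.
# The figures below are illustrative placeholders, not live API results.
def efficiency(repo_count: int, search_volume: float) -> float:
    """Work output (Github repos) per unit of attention input (search volume)."""
    return repo_count / search_volume

baseline = efficiency(12461, 1475)   # Ethereum-scale placeholder figures
candidate = efficiency(21, 300)      # smaller-ecosystem placeholder figures

print(f"baseline eta:  {baseline:.2f}")
print(f"candidate eta: {candidate:.2f}")
print(f"baseline is {baseline / candidate:.0f}x the candidate")
```

The full script in the Method & Python Script section performs this same division using live repository counts from the GitHub search API and search volumes from pytrends.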
Results

| DLT | Repos | Efficiency (η) (23/07/14) | Vs Ethereum | Search terms (Github) | Search terms (Google Trends) |
| --- | --- | --- | --- | --- | --- |
| Ethereum (/contents/tech/comparisons/radix-vs-ethereum) | 12461 | 8.45 | 1 | “Solidity Ethereum” | “ETH Ethereum” |
| Solana | 1076 | 0.79 | 11x | “Rust Solana” | “SOL Solana” |
| Avalanche | 88 | 0.09 | 91x | “Solidity Avalanche” | “AVAX Avalanche” |
| Cardano | 176 | 0.09 | 97x | “Plutus Cardano” | “ADA Cardano” |
| Radix | 21 | 0.07 | 123x | “Scrypto Radix” | “XRD Radix” |

Method & Python Script
- Install VS Code: https://code.visualstudio.com (https://code.visualstudio.com) or another IDE.
- In VS Code open a new terminal window by navigating to Terminal > New Terminal.
- Install Homebrew by pasting the following code into the terminal and pressing Enter: /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
- Install Python in the same way: brew install python
- Next, install the modules that the script needs (dependencies): pip3 install requests pytrends tqdm
- Now, create a project folder and navigate to it in VS Code via File > Open Folder.
- Create a new Python file in VS Code via File > New File. Name it something like DLTefficiency.py
- Copy and paste the following script into DLTefficiency.py and save it.
```python
# v.0.0.3
import requests
from pytrends.request import TrendReq
from concurrent.futures import ThreadPoolExecutor
from tqdm import tqdm

# Function to get the number of Github repositories for a search query
def get_github_repo_count(session, search_query):
    url = f"https://api.github.com/search/repositories?q={search_query}"
    try:
        response = session.get(url)
        response.raise_for_status()
        return (search_query, response.json()['total_count'])
    except requests.exceptions.HTTPError:
        return (search_query, None)

# Function to get the worldwide popularity score from Google Trends for a search query
def get_google_trends_score(pytrends, search_query):
    pytrends.build_payload([search_query], timeframe='today 1-m')
    interest_over_time_df = pytrends.interest_over_time()
    if not interest_over_time_df.empty:
        return (search_query, interest_over_time_df[search_query].sum())
    else:
        return (search_query, None)

# Function to calculate the efficiency score for a (Github, Trends) query pair
def calculate_efficiency_score(session, pytrends, search_query_pair):
    github_search_query, google_trends_search_query = search_query_pair
    with ThreadPoolExecutor(max_workers=2) as executor:
        github_future = executor.submit(get_github_repo_count, session, github_search_query)
        google_future = executor.submit(get_google_trends_score, pytrends, google_trends_search_query)
        github_repo_result, google_trends_result = github_future.result(), google_future.result()
    if github_repo_result[1] is not None and google_trends_result[1] is not None:
        return (search_query_pair, github_repo_result[1] / google_trends_result[1])
    else:
        return (search_query_pair, None)

# Main program to calculate efficiency scores for different search queries
search_queries = [
    ("Rust Solana", "SOL Solana"),
    ("Plutus Cardano", "ADA Cardano"),
    ("Solidity Avalanche", "AVAX Avalanche"),
    ("Scrypto Radix", "XRD Radix"),
    # ("wasm polkadot", "DOT polkadot")  # Github seems to only allow 4 queries.
]
baseline_search_query = ("Solidity Ethereum", "ETH Ethereum")

with requests.Session() as session:
    pytrends = TrendReq(hl='en-US', tz=360)
    github_results, google_results, efficiency_scores = [], [], []
    for query in tqdm([baseline_search_query] + search_queries, desc='Calculating Efficiency Scores'):
        github_result = get_github_repo_count(session, query[0])
        google_result = get_google_trends_score(pytrends, query[1])
        efficiency_score = calculate_efficiency_score(session, pytrends, query)
        github_results.append(github_result)
        google_results.append(google_result)
        efficiency_scores.append(efficiency_score)

print("Github Repository Counts:\n")
for query, count in github_results:
    print(f"{query}: {count}")

print("\nGoogle Trends Scores:\n")
for query, score in google_results:
    print(f"{query}: {score}")

print("\nEfficiency Scores:\n")
baseline_efficiency_score = efficiency_scores[0][1]  # Get baseline efficiency score
for query, score in efficiency_scores:
    if score is not None:
        value_metric = baseline_efficiency_score / score
        print(f"{query[0].split(' ')[1]}: {score:.2f} (Ethereum = {value_metric:.0f}x {query[0].split(' ')[1]})")
    else:
        print(f"Error calculating efficiency score for '{query[0]}' and '{query[1]}'.")
```

- Run the script from the terminal with: python3 DLTefficiency.py

## Terms of Use
URL: https://radix.wiki/contents/resources/legal/terms-of-use
Updated: 2026-01-27

Please read these terms and conditions carefully before using this site.

ALL USERS

Terms of website use
These terms of use (together with the documents referred to in them) tell you the terms on which you may make use of our website https://RADIX.wiki (https://RADIX.wiki) (our site), whether as a guest or a registered user. Use of our site includes accessing, browsing, or registering to use our site. Please read these terms of use carefully before you start to use our site, as these will apply to your use of our site. We recommend that you print a copy of this for future reference.
By using our site, you confirm that you accept these terms of use and that you agree to comply with them. If you do not agree to these terms of use, you must not use our site.

You may print off one copy, and may download extracts, of any page(s) from our site for your personal use, and you may draw the attention of others within your organisation to content posted on our site. You must not modify the paper or digital copies of any materials you have printed off or downloaded in any way, and you must not use any illustrations, photographs, video or audio sequences or any graphics separately from any accompanying text. Our status (and that of any identified contributors) as the authors of content on our site must always be acknowledged. You must not use any part of the content on our site for commercial purposes without obtaining a licence to do so from us or our licensors. If you print off, copy or download any part of our site in breach of these terms of use, your right to use our site will cease immediately and you must, at our option, return or destroy any copies of the materials you have made.

Other applicable terms

The following additional terms also apply to your use of our site:

- Our Privacy Policy (/contents/resources/legal/privacy-policy), which sets out the terms on which we process any personal data we collect from you, or that you provide to us. By using our site, you warrant that all data provided by you is accurate.

Information about us

https://RADIX.wiki is a site owned and operated by the RADIX.wiki DAO ("We").

Changes to these terms

We may revise these terms of use at any time by amending this page. Please check this page from time to time to take notice of any changes we make, as they are binding on you.

Changes to our site

We may update our site from time to time, and may change the content at any time. However, please note that any of the content on our site may be out of date at any given time, and we are under no obligation to update it. We do not guarantee that our site, or any content on it, will be free from errors or omissions.

Accessing our site

Our site is made available free of charge, although the provision of some of the services accessible through it may not be. We do not guarantee that our site, or any content on it, will always be available or be uninterrupted. Access to our site is permitted on a temporary basis. We may suspend, withdraw, discontinue or change all or any part of our site without notice. We will not be liable to you if for any reason our site is unavailable at any time or for any period. You are responsible for making all arrangements necessary for you to have access to our site. You are also responsible for ensuring that all persons who access our site through your internet connection are aware of these terms of use and other applicable terms and conditions, and that they comply with them.

Our site is directed to people residing in the United Kingdom. We do not represent that content available on or through our site is appropriate in other locations. If you choose to access our site from outside the United Kingdom, you do so at your own risk.

No reliance on information

The Site has not been developed to meet your individual requirements, and it is therefore your responsibility to ensure that the facilities and functions of the site meet your requirements. We make no representations, warranties or guarantees, whether express or implied, that the content on our site is accurate, complete or up-to-date. The materials are not exhaustive and may not always reflect the most up-to-date research in all areas. Information on the site presents the more commonly found aspects and for editorial reasons may not include certain substantive and commonly found phenomena.
We make no warranty or representation that:

- the Site will meet your particular requirements;
- the results that may be obtained from the use of the Site will be as stated; and
- the quality of advice and information will meet your needs or expectations.

Limitation of our liability

Nothing in these terms of use excludes or limits our liability for death or personal injury arising from our negligence, or our fraud or fraudulent misrepresentation, or any other liability that cannot be excluded or limited by English law. To the extent permitted by law, we exclude all conditions, warranties, representations or other terms which may apply to our site or any content on it, whether express or implied. We will not be liable to any user for any loss or damage, whether in contract, tort (including negligence), breach of statutory duty, or otherwise, even if foreseeable, arising under or in connection with:

- use of, or inability to use, our site; or
- use of or reliance on any content displayed on our site.

If you are a business user, please note that in particular, we will not be liable for:

- loss of profits, sales, business, or revenue;
- business interruption;
- loss of anticipated savings;
- loss of business opportunity, goodwill or reputation; or
- any indirect or consequential loss or damage.

If you are a consumer user, please note that we only provide our site for domestic and private use. You agree not to use our site for any commercial or business purposes, and we have no liability to you for any loss of profit, loss of business, business interruption, or loss of business opportunity.

We will not be liable for any loss or damage caused by a virus, distributed denial-of-service attack, or other technologically harmful material that may infect your computer equipment, computer programs, data or other proprietary material due to your use of our site or to your downloading of any content on it, or on any website linked to it.
We assume no responsibility for the content of websites linked on our site. Such links should not be interpreted as endorsement by us of those linked websites. We will not be liable for any loss or damage that may arise from your use of them.

Viruses

We do not guarantee that our site will be secure or free from bugs or viruses. You are responsible for configuring your information technology, computer programmes and platform in order to access our site. You should use your own virus protection software.

You must not misuse our site by knowingly introducing viruses, trojans, worms, logic bombs or other material which is malicious or technologically harmful. You must not attempt to gain unauthorised access to our site, the server on which our site is stored or any server, computer or database connected to our site. You must not attack our site via a denial-of-service attack or a distributed denial-of-service attack. By breaching this provision, you would commit a criminal offence under the Computer Misuse Act 1990. We will report any such breach to the relevant law enforcement authorities, and we will co-operate with those authorities by disclosing your identity to them. In the event of such a breach, your right to use our site will cease immediately.

Linking to our site

You may link to any page on our site provided you do so in a way that is fair and legal and does not damage our reputation or take advantage of it. You must not establish a link in such a way as to suggest any form of association, approval or endorsement on our part where none exists. You must not establish a link to our site in any website that is not owned by you. Our site must not be framed on any other site. We reserve the right to withdraw linking permission without notice. The website in which you are linking must comply in all respects with the content standards set out in our Acceptable Use Policy.
Third party links and resources in our site

Where our site contains links to other sites and resources provided by third parties, these links are provided for your information only. We have no control over the contents of those sites or resources.

Applicable law

If you are a consumer, please note that these terms of use, their subject matter and their formation are governed by English law. You and we both agree that the courts of England and Wales will have non-exclusive jurisdiction. However, if you are a resident of Northern Ireland you may also bring proceedings in Northern Ireland, and if you are a resident of Scotland, you may also bring proceedings in Scotland. If you are a business, these terms of use, their subject matter and their formation (and any non-contractual disputes or claims) are governed by English law. We both agree to the exclusive jurisdiction of the courts of England and Wales.

CONTRIBUTORS

Your account and password

If you choose to become a contributor to our site, you will choose, or be provided with, a user identification code and password as part of our security procedures. You must treat such information as confidential and must not disclose it to any third party. We have the right to disable any user identification code or password, whether chosen by you or allocated by us, at any time, if in our reasonable opinion you have failed to comply with any of the provisions of these terms of use. If you know or suspect that anyone other than you knows your user identification code or password, you must promptly notify us via https://twitter.com/RadixWiki.

Intellectual property rights and copyright

We recognise that copyright is the backbone of our site. While we are the owner or the licensee of all intellectual property rights in our site, the text of our articles is copyrighted by its contributors, who formally and completely release all rights to the work as soon as it is uploaded to our site.
Once this irrevocable act is complete, they no longer have any power over how the work is used, since it is then owned by the public as a whole. RADIX.wiki content can be copied, modified, and redistributed if and only if the copied version is made available on the same terms to others and acknowledgment of the authors of the RADIX.wiki content used is included.

When you merge or split an article, or otherwise move text from one page to another within RADIX.wiki, the page history functionality cannot by itself determine where the content originally came from. If you are copying text within RADIX.wiki, you must put a link to the source page in an edit summary at the destination page. It is encouraged to do the same thing at the source page, and to add notices at the talk pages of both. If you reuse text which you created yourself, the above may still be a good idea to avoid confusion, but it isn't mandatory.

Uploading content to our site

Whenever you make use of a feature that allows you to upload content to our site, you must comply with the content standards set out in our Acceptable Use Policy (/contents/resources/legal/acceptable-use-policy). You warrant that any such contribution does comply with those standards, and you will be liable to us and indemnify us for any breach of that warranty. If you are a consumer user, this means you will be responsible for any loss or damage we suffer as a result of your breach of warranty.

You retain all of your ownership rights in your content, but you are required to grant us and other users of the site a limited licence to use, store and copy that content and to distribute and make it available to third parties. The rights you license to us are described in the next paragraph (Rights you licence).
We also have the right to disclose your identity to any third party who is claiming that any content posted or uploaded by you to our site constitutes a violation of their intellectual property rights, or of their right to privacy. We will not be responsible, or liable to any third party, for the content or accuracy of any content posted by you or any other user of our site. We have the right to remove any posting you make on our site if, in our opinion, your post does not comply with these terms or the content standards set out in our Acceptable Use Policy. The views expressed by other users on our site do not represent our views or values. You are solely responsible for securing and backing up your content.

Rights you licence

When you upload or post content to our site, you grant the following licences:

- A worldwide, non-exclusive, royalty-free, transferable licence to use, reproduce, amend, distribute, prepare derivative works of, display, and perform that content in connection with the services provided by our site and across different media, and to promote the site or services; and
- A worldwide, non-exclusive, royalty-free, transferable licence to allow third parties to use the content for their purposes.

Contact us

To contact us, please message us at https://twitter.com/RadixWiki.

Thank you for visiting our site.

## Privacy Policy

URL: https://radix.wiki/contents/resources/legal/privacy-policy
Updated: 2026-01-27

The RADIX.wiki DAO ("We") are committed to protecting and respecting your privacy. This policy (together with our Terms of Use (/contents/resources/legal/terms-of-use) and any other documents referred to in it) sets out the basis on which any personal data we collect from you, or that you provide to us, will be processed by us. Please read the following carefully to understand our views and practices regarding your personal data and how we will treat it.
By visiting RADIX.wiki you are accepting and consenting to the practices described in this policy. For the purpose of the Data Protection Act 1998 (the Act), the data controller is the RADIX.wiki DAO.

Information we may collect from you

We may collect and process the following data about you:

Information you give us

You may give us information about you by filling in forms on our site or by communicating with us by phone, e-mail or otherwise. This includes information you provide when, for example, you search for information, request a downloadable document, ask for a legal advice line quote or call-back, participate in social media functions on our site, enter a competition, promotion or survey, or report a problem with our site. The information you give us may include your name, e-mail address and phone number, personal description and photograph, and other sensitive or confidential personal information.

Information we collect about you

With regard to each of your visits to our site we may automatically collect the following information:

- technical information, including the Internet protocol (IP) address used to connect your computer to the Internet, your login information, browser type and version, time zone setting, browser plug-in types and versions, operating system and platform;
- information about your visit, including the full Uniform Resource Locators (URL) clickstream to, through and from our site (including date and time); products you viewed or searched for; page response times, download errors, length of visits to certain pages, page interaction information (such as scrolling, clicks, and mouse-overs), and methods used to browse away from the page and any phone number used to call our customer service number.

Information we receive from other sources

We may receive information about you if you use any of the other websites we operate or the other services we provide.
In this case we will have informed you when we collected that data that it may be shared internally and combined with data collected on this site. We are also working closely with third parties (including, for example, business partners, sub-contractors in technical, payment and delivery services, advertising networks, analytics providers, search information providers, and credit reference agencies) and may receive information about you from them.

Cookies

Our website uses cookies to distinguish you from other users of our website. This helps us to provide you with a good experience when you browse our website and also allows us to improve our site.

Uses made of the information

We use information held about you in the following ways:

Information you give to us

We will use this information:

- to carry out our obligations arising from any contracts entered into between you and us and to provide you with the information, products and services that you request from us;
- to provide you with information about other goods and services we offer that are similar to those that you have already purchased or enquired about;
- to provide you, or permit selected third parties to provide you, with information about goods or services we feel may interest you. If you are an existing client, we will only contact you by electronic means (e-mail or SMS) with information about goods and services similar to those which were the subject of a previous sale or negotiations of a sale to you. If you are a new client, and where we permit selected third parties to use your data, we (or they) will contact you by electronic means only if you have consented to this. If you do not want us to use your data in this way, or to pass your details on to third parties for marketing purposes, please tick the relevant box situated on the form on which we collect your data (the registration form);
- to notify you about changes to our service;
- to ensure that content from our site is presented in the most effective manner for you and for your computer.

Information we collect about you

We will use this information:

- to administer our site and for internal operations, including troubleshooting, data analysis, testing, research, statistical and survey purposes;
- to improve our site to ensure that content is presented in the most effective manner for you and for your computer;
- to allow you to participate in interactive features of our service, when you choose to do so;
- as part of our efforts to keep our site safe and secure;
- to measure or understand the effectiveness of advertising we serve to you and others, and to deliver relevant advertising to you;
- to make suggestions and recommendations to you and other users of our site about goods or services that may interest you or them.

Information we receive from other sources

We may combine this information with information you give to us and information we collect about you. We may use this information and the combined information for the purposes set out above (depending on the types of information we receive).

Disclosure of your information

We may share your personal information with any member of our group, which means our subsidiaries, our ultimate holding company and its subsidiaries, as defined in section 1159 of the UK Companies Act 2006. We may share your information with selected third parties including:

- Business partners, suppliers and sub-contractors for the performance of any contract we enter into with them or you.
- Analytics and search engine providers that assist us in the improvement and optimisation of our site.
We may disclose your personal information to third parties:

- In the event that we sell or buy any business or assets, in which case we may disclose your personal data to the prospective seller or buyer of such business or assets.
- If the RADIX.wiki DAO or substantially all of its assets are acquired by a third party, in which case personal data held by it about its clients will be one of the transferred assets.
- If we are under a duty to disclose or share your personal data in order to comply with any legal obligation, or in order to enforce or apply our Terms of Use (/contents/resources/legal/terms-of-use) and other agreements; or to protect the rights, property, or safety of the RADIX.wiki DAO, our clients, customers, or others. This includes exchanging information with other companies and organisations for the purposes of fraud protection and credit risk reduction.

Where we store your personal data

The data that we collect from you may be transferred to, and stored at, a destination outside the European Economic Area ("EEA"). It may also be processed by staff operating outside the EEA who work for us or for one of our suppliers. Such staff may be engaged in, among other things, the fulfilment of your order, the processing of your payment details and the provision of support services. By submitting your personal data, you agree to this transfer, storing or processing. We will take all steps reasonably necessary to ensure that your data is treated securely and in accordance with this privacy policy.

All information you provide to us is stored on our secure servers. Any payment transactions will be encrypted using SSL technology. Where we have given you (or where you have chosen) a password which enables you to access certain parts of our site, you are responsible for keeping this password confidential. We ask you not to share a password with anyone.
Unfortunately, the transmission of information via the internet is not completely secure. Although we will do our best to protect your personal data, we cannot guarantee the security of your data transmitted to our site; any transmission is at your own risk. Once we have received your information, we will use strict procedures and security features to try to prevent unauthorised access.

Your rights

You have the right to ask us not to process your personal data for marketing purposes. You can exercise your right to prevent such processing by checking certain boxes on the forms we use to collect your data. You can also exercise the right at any time by messaging us at https://twitter.com/RadixWiki.

Our site may, from time to time, contain links to and from the websites of our partner networks, advertisers and affiliates. If you follow a link to any of these websites, please note that these websites have their own privacy policies and that we do not accept any responsibility or liability for these policies. Please check these policies before you submit any personal data to these websites.

Access to information

The Act gives you the right to access information held about you. Your right of access can be exercised in accordance with the Act. Any access request may be subject to a fee of £10 to meet our costs in providing you with details of the information we hold about you.

Changes to our privacy policy

Any changes we may make to our privacy policy in the future will be posted on this page. Please check back frequently to see any updates or changes to our privacy policy.
Contact

Questions, comments and requests regarding this privacy policy are welcomed and should be messaged to us at https://twitter.com/RadixWiki.

## Acceptable Use Policy

URL: https://radix.wiki/contents/resources/legal/acceptable-use-policy
Updated: 2026-01-27

This Acceptable Use Policy sets out the terms between you and us under which you may access our website https://RADIX.wiki (our site). This acceptable use policy applies to all users of, and visitors to, our site. Your use of our site means that you accept, and agree to abide by, all the policies in this acceptable use policy, which supplement our Terms of Use (/contents/resources/legal/terms-of-use). RADIX.wiki is operated by the RADIX.wiki DAO (we or us).

Prohibited Uses

You may use our site only for lawful purposes. You may not use our site:

- In any way that breaches any applicable local, national or international law or regulation.
- In any way that is unlawful or fraudulent, or has any unlawful or fraudulent purpose or effect.
- For the purpose of harming or attempting to harm minors in any way.
- To send, knowingly receive, upload, download, use or re-use any material which does not comply with our content standards.
- To transmit, or procure the sending of, any unsolicited or unauthorised advertising or promotional material or any other form of similar solicitation (spam).
- To knowingly transmit any data, send or upload any material that contains viruses, Trojan horses, worms, time-bombs, keystroke loggers, spyware, adware or any other harmful programs or similar computer code designed to adversely affect the operation of any computer software or hardware.

You also agree:

- Not to reproduce, duplicate, copy or re-sell any part of our site in contravention of the provisions of our Terms of Use (/contents/resources/legal/terms-of-use).
- Not to access without authority, interfere with, damage or disrupt:
  - any part of our site;
  - any equipment or network on which our site is stored;
  - any software used in the provision of our site; or
  - any equipment or network or software owned or used by any third party.

Interactive Services

We may from time to time provide interactive services on our site, including, without limitation:

- Chat rooms.
- Bulletin boards.
- Blogs.

Where we do provide any interactive service, we will provide clear information to you about the kind of service offered, if it is moderated and what form of moderation is used (including whether it is human or technical). We will do our best to assess any possible risks for users (and in particular, for children) from third parties when they use any interactive service provided on our site, and we will decide in each case whether it is appropriate to use moderation of the relevant service (including what kind of moderation to use) in the light of those risks. However, we are under no obligation to oversee, monitor or moderate any interactive service we provide on our site, and we expressly exclude our liability for any loss or damage arising from the use of any interactive service by a user in contravention of our content standards, whether the service is moderated or not.

The use of any of our interactive services by a minor is subject to the consent of their parent or guardian. We advise parents who permit their children to use an interactive service that it is important that they communicate with their children about their safety online, as moderation is not foolproof. Minors who are using any interactive service should be made aware of the potential risks to them. Where we do moderate an interactive service, we will normally provide you with a means of contacting the moderator, should a concern or difficulty arise.
Content Standards

These content standards apply to any and all material which you contribute to our site (contributions), and to any interactive services associated with it. You must comply with the spirit and the letter of the following standards. The standards apply to each part of any contribution as well as to its whole.

Contributions must:

- Be accurate (where they state facts).
- Be genuinely held (where they state opinions).
- Comply with applicable law in the UK and in any country from which they are posted.

Contributions must not:

- Contain any material which is defamatory of any person.
- Contain any material which is obscene, offensive, hateful or inflammatory.
- Promote sexually explicit material.
- Promote violence.
- Promote discrimination based on race, sex, religion, nationality, disability, sexual orientation or age.
- Infringe any copyright, database right or trade mark of any other person.
- Be likely to deceive any person.
- Be made in breach of any legal duty owed to a third party, such as a contractual duty or a duty of confidence.
- Promote any illegal activity.
- Be threatening, abuse or invade another’s privacy, or cause annoyance, inconvenience or needless anxiety.
- Be likely to harass, upset, embarrass, alarm or annoy any other person.
- Be used to impersonate any person, or to misrepresent your identity or affiliation with any person.
- Give the impression that they emanate from us, if this is not the case.
- Advocate, promote or assist any unlawful act such as (by way of example only) copyright infringement or computer misuse.

Suspension and Termination

We will determine, in our discretion, whether there has been a breach of this acceptable use policy through your use of our site. When a breach of this policy has occurred, we may take such action as we deem appropriate.
Failure to comply with this acceptable use policy constitutes a material breach of the Terms of Use (/contents/resources/legal/terms-of-use) upon which you are permitted to use our site, and may result in our taking all or any of the following actions:

- Immediate, temporary or permanent withdrawal of your right to use our site.
- Immediate, temporary or permanent removal of any posting or material uploaded by you to our site.
- Issue of a warning to you.
- Legal proceedings against you for reimbursement of all costs on an indemnity basis (including, but not limited to, reasonable administrative and legal costs) resulting from the breach.
- Further legal action against you.
- Disclosure of such information to law enforcement authorities as we reasonably feel is necessary.

We exclude liability for actions taken in response to breaches of this acceptable use policy. The responses described in this policy are not limited, and we may take any other action we reasonably deem appropriate.

Changes to the Acceptable Use Policy

We may revise this acceptable use policy at any time by amending this page. You are expected to check this page from time to time to take notice of any changes we make, as they are legally binding on you. Some of the provisions contained in this acceptable use policy may also be superseded by provisions or notices published elsewhere on our site.

## How to Buy $XRD

URL: https://radix.wiki/contents/resources/how-to-buy-xrd
Updated: 2026-01-27

$XRD is the native currency of the Radix protocol and is used to pay transaction fees. Below you will find a list of exchanges that list $XRD.
$XRD Exchanges

- https://www.bitget.com/price/radix
- https://www.bitmart.com/en-US/trade/XRD_USDT?layout=basic
- https://www.gate.io/trade/XRD_USDT
- https://hitbtc.com/xrd-to-usdt
- https://www.kucoin.com/trade/XRD-USDT
- https://www.mexc.com/exchange/XRD_USDT
- https://app.rocketx.exchange/swap/ETHEREUM.ethereum/RADIX.radix/1?from=Ethereum&to=Radix&mode=w
- https://switchere.com/

## Brand Assets

URL: https://radix.wiki/contents/resources/brand-assets
Updated: 2026-01-27

RADIX.wiki brand assets.

LogoMarks

- Logomark: white
- Logomark: accessible
- Logomark: transparent PNG

LogoTypes

- Logotype: white
- Logotype: accessible
- Logotype: transparent PNG

OpenGraph Cards

- OpenGraph: white
- OpenGraph: accessible

## test

URL: https://radix.wiki/jobs/test
Updated: 2026-01-21

Getting Started

Start writing your content here...