Deputy General Manager of the Bank for International Settlements: What is tokenization, and how is it done?
Author: Zhang Feng
At the Singapore FinTech Festival in November 2025, Andrea M. Maechler, Deputy General Manager of the Bank for International Settlements (BIS) and Acting Head of the BIS Innovation Hub, delivered a thought-provoking speech on “tokenization,” a key trend reshaping the global financial system.
In her speech, she not only clarified the definition and mechanisms of tokenization but also revealed how it can drive payment system innovation and foster new business models through programmable platforms. We believe her remarks also provide regulators with an important framework for consideration.
1. What is Tokenization? From Static Records to Dynamic Programmable Assets
In simple terms, tokenization is the process of transforming static ownership records of financial assets, such as deposits, bonds, and bills, into verifiable, transferable digital tokens that operate on programmable platforms. This is not mere digitalization: blockchain or distributed ledger technology gives each asset a digital identity in token form, making it divisible, traceable, and programmable.
In traditional financial systems, asset records are often centralized, siloed, and updated only with a lag. A cross-border payment, for example, may pass through multiple banks and clearing systems, each exchanging messages, reconciling, and settling, a process that can take days and incur high costs. Tokenization creates digital tokens that correspond one-to-one with the underlying assets, secured by cryptography, so that assets can move and interact in real time on open, programmable platforms.
Programmability is one of tokenization’s core features. Tokens can not only represent value but also embed smart contracts, code that executes automatically. A token can be programmed to pay interest at a specific time, for instance, or to transfer ownership once certain conditions are met. This brings unprecedented automation and precision to financial transactions.
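To make programmability concrete, here is a minimal sketch of a token that carries its own transfer rule as executable logic. It is illustrative only: the class and condition names are hypothetical, and on a real platform this logic would live in a smart contract rather than ordinary application code.

```python
from typing import Callable

# A minimal sketch of programmability: a token that carries its own
# transfer rule as executable logic. All names here are hypothetical;
# on a real platform this would be a smart contract.

class ProgrammableToken:
    def __init__(self, asset_id: str, owner: str):
        self.asset_id = asset_id
        self.owner = owner

    def transfer_when(self, new_owner: str, condition: Callable[[], bool]) -> bool:
        """Ownership moves automatically, but only once the condition holds."""
        if condition():
            self.owner = new_owner
            return True
        return False

# Example: a bill token that changes hands only after delivery is confirmed.
delivery_confirmed = False
token = ProgrammableToken("bill-001", owner="seller")

token.transfer_when("buyer", lambda: delivery_confirmed)  # condition false: no effect
assert token.owner == "seller"

delivery_confirmed = True
token.transfer_when("buyer", lambda: delivery_confirmed)  # condition met: transfer
assert token.owner == "buyer"
```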
2. How to Tokenize? Mechanisms, Platforms, and Typical Cases
Implementing tokenization is not an overnight process; it depends on several key elements: asset onboarding, token issuance, platform integration, and compliance frameworks. Maechler illustrated a practical pathway for tokenization in cross-border payments using the BIS Innovation Hub’s Project Agorá as an example.
(a) Technical Pathways for Tokenization
Asset Identification and Anchoring. First, clearly identify the assets to be tokenized and establish a reliable, auditable correspondence between them and the digital tokens, typically secured through legal agreements and technical attestations (a minting sketch follows this list).
Token Minting and Issuance. On compliant programmable platforms (such as permissioned blockchains), tokens representing the assets are issued. These tokens must adhere to relevant financial and securities regulations.
Platform Integration and Interoperability. Tokens need to operate on platforms supporting smart contracts, enabling interoperability with other tokens, payment systems, and traditional ledgers.
Automated Clearing and Settlement. Smart contracts can bundle transaction instructions, asset transfers, and fund settlements to achieve “atomic settlement,” in which all steps succeed or fail together, eliminating settlement risk.
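As a rough illustration of the first two steps, the sketch below registers a legal anchor for an asset and refuses to mint tokens beyond the anchored claim. The registry, classes, and field names are all hypothetical, not any specific platform’s API.

```python
from dataclasses import dataclass

# Hypothetical sketch: tokens may be minted only against an identified,
# legally anchored asset, and never beyond the size of the claim.

@dataclass(frozen=True)
class AssetAnchor:
    asset_id: str    # identifier of the underlying asset
    legal_ref: str   # reference to the legal agreement backing the token
    custodian: str   # who holds the underlying asset
    amount: float    # size of the claim being tokenized

class TokenRegistry:
    def __init__(self):
        self.anchors: dict[str, AssetAnchor] = {}
        self.supply: dict[str, float] = {}

    def register_anchor(self, anchor: AssetAnchor) -> None:
        """Step 1: record the auditable link between asset and token."""
        self.anchors[anchor.asset_id] = anchor

    def mint(self, asset_id: str, amount: float) -> bool:
        """Step 2: issuance is refused beyond the anchored claim."""
        anchor = self.anchors.get(asset_id)
        if anchor is None:
            return False                        # no legal anchor: no token
        minted = self.supply.get(asset_id, 0.0)
        if minted + amount > anchor.amount:
            return False                        # over-issuance blocked
        self.supply[asset_id] = minted + amount
        return True

registry = TokenRegistry()
registry.register_anchor(AssetAnchor("bill-7", "contract:2025-114", "custodian-X", 1_000.0))
assert registry.mint("bill-7", 800.0)       # within the anchored claim
assert not registry.mint("bill-7", 300.0)   # would exceed backing: refused
```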
(b) Project Agorá: How Tokenization Reshapes Cross-Border Payments
Project Agorá is a flagship initiative led by the BIS in collaboration with seven central banks and over 40 financial institutions. It brings tokenized deposits (digital representations of commercial bank deposits) and tokenized reserves (digital representations of central bank money) together on the same programmable platform. In a cross-border payment, the payer’s tokenized deposit and the payee’s tokenized deposit can be exchanged instantly via smart contracts, with final settlement completed in real time in central bank money, all within a single atomic operation.
This approach significantly reduces delays, costs, and risks associated with cross-border payments, while enhancing transparency and traceability. Maechler pointed out that such experiments provide critical technological and governance models for large-scale future applications.
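To see what a single atomic operation might look like, the stylized sketch below decomposes a cross-border payment into three legs, two in tokenized commercial bank deposits and one in tokenized central bank reserves, that either all commit or all fail. The ledger structure and account names are hypothetical and are not Agorá’s actual design.

```python
from copy import deepcopy

# Stylized two-tier settlement: customer legs move tokenized deposits,
# the interbank leg settles in tokenized central bank reserves, and all
# three legs commit as one atomic bundle.

def settle_atomically(ledger: dict, legs) -> bool:
    """Apply every (src, dst, asset, amount) leg, or none of them."""
    staged = deepcopy(ledger)                # staging copy: nothing final yet
    for src, dst, asset, amount in legs:
        if staged.get(src, {}).get(asset, 0) < amount:
            return False                     # one failing leg aborts the bundle
        staged[src][asset] -= amount
        staged.setdefault(dst, {})
        staged[dst][asset] = staged[dst].get(asset, 0) + amount
    ledger.clear()
    ledger.update(staged)                    # commit: all legs take effect together
    return True

ledger = {
    "payer@bankA": {"depositA": 500},           # payer's claim on bank A
    "bankA":       {"reserves": 1_000},         # bank A's central bank money
    "bankB":       {"reserves": 1_000, "depositB": 200},
    "payee@bankB": {"depositB": 0},
    "bankA_sink":  {},                          # bank A retires the payer's claim here
}

ok = settle_atomically(ledger, [
    ("payer@bankA", "bankA_sink", "depositA", 100),  # deposit leg at bank A
    ("bankA", "bankB", "reserves", 100),             # final settlement in central bank money
    ("bankB", "payee@bankB", "depositB", 100),       # deposit leg at bank B
])
assert ok and ledger["payee@bankB"]["depositB"] == 100
```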
3. The Value of Tokenization: Efficiency Gains and Emergence of New Business Models
The BIS and other international organizations hold tokenization in high regard because it can create value along several dimensions:
(a) Efficiency Improvements and Cost Reductions
In traditional finance, reconciliation, clearing, and settlement involve extensive manual and intermediary operations. Tokenization, through automation and atomic settlement, drastically shortens processing times and reduces operational risks and compliance costs. Maechler emphasized that especially in cross-border contexts, this efficiency gain could bring “significant systemic benefits.”
(b) New Business Models and Financial Products
Tokenization is opening up a range of unprecedented application scenarios:
Tokenization of Bonds. The global government bond market is worth roughly $80 trillion; tokenization could automate issuance, trading, interest payments, and redemption, enhancing liquidity and lowering entry barriers (a lifecycle sketch follows this list).
AI and IoT Payments. Programmable tokens can be combined with AI agents to enable real-time, high-frequency micro-payments between machines (e.g., automatic electric vehicle charging payments) or automate invoice settlements in trade finance.
Digitization of Traditional Instruments. For example, the BIS and the World Bank are collaborating on Project Promissa, which aims to tokenize the paper promissory notes governments use to fund multilateral development banks, improving capital efficiency and transparency.
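As a hypothetical illustration of the bond scenario above, the sketch below automates the three lifecycle events mentioned: issuance in small denominations, scheduled coupon payments, and redemption at maturity. It is a toy model, not a description of any live system.

```python
from datetime import date

# Toy model of an automated bond lifecycle on a token ledger. All names
# and numbers are invented for illustration.

class TokenizedBond:
    def __init__(self, face: float, coupon: float, maturity: date):
        self.face, self.coupon, self.maturity = face, coupon, maturity
        self.holdings: dict[str, int] = {}   # holder -> number of units

    def issue(self, subscriptions: dict[str, int]) -> None:
        """Small-denomination units lower the entry barrier for investors."""
        for holder, units in subscriptions.items():
            self.holdings[holder] = self.holdings.get(holder, 0) + units

    def pay_coupon(self, cash: dict[str, float]) -> None:
        """Runs on each coupon date: interest is credited pro rata."""
        for holder, units in self.holdings.items():
            cash[holder] = cash.get(holder, 0.0) + units * self.face * self.coupon

    def redeem(self, today: date, cash: dict[str, float]) -> None:
        """At maturity, principal is repaid and the tokens are retired."""
        if today < self.maturity:
            return
        for holder, units in self.holdings.items():
            cash[holder] = cash.get(holder, 0.0) + units * self.face
        self.holdings.clear()

# A small issue split into $100 units; retail investors subscribe alongside a fund.
bond = TokenizedBond(face=100.0, coupon=0.03, maturity=date(2030, 11, 15))
bond.issue({"fund": 90, "retail_1": 6, "retail_2": 4})
cash: dict[str, float] = {}
bond.pay_coupon(cash)
assert cash["retail_1"] == 6 * 100.0 * 0.03  # coupon credited automatically
```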
(c) Financial Inclusion and Market Integrity
Tokenization can also reach corners of finance that traditional systems serve poorly. By lowering transaction costs and strengthening trust, it makes it easier for SMEs and individual investors to participate in global financial markets, while its traceability supports anti-money laundering and anti-corruption efforts.
4. The Deeper Significance of Maechler’s Discourse: Logic, Operations, Fraud Prevention, and Regulation
Maechler’s speech is not merely a description of technological trends; it also implies a comprehensive framework for understanding tokenization:
Revealing the Underlying Logic of Tokenization. She made clear that the essence of tokenization is the reconstruction of financial processes through programmability and composability. This involves not only technological upgrades but also a systemic rethinking of the roles of financial intermediaries, the nature of money, and how contracts are executed.
Clarifying Operational Pathways for Asset Tokenization. As seen in projects like Agorá, successful tokenization currently relies on central bank money as the final settlement asset to ensure creditworthiness; it must be promoted on regulated, interoperable platforms; and attention should be paid to integration with traditional systems to avoid fragmentation.
Proposing Fraud Prevention and Quality Assurance Mechanisms for Tokenized Assets. Tokenization does not automatically solve trust issues. Maechler implied that ensuring the authenticity and quality of tokens requires robust legal declarations and asset backing; transparent issuance and redemption mechanisms; independent audits and on-chain verification tools; and regulatory oversight of issuers and platforms (see the sketch after this list).
Providing Regulatory Insights for Tokenized Assets. She noted that recent regulatory developments around tokenized money (such as stablecoins) lay a legal foundation for broader asset tokenization. Regulation should focus on clarifying legal status and investor protections, preventing fragmentation and systemic risk, and encouraging cross-jurisdictional cooperation and standardization.
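To illustrate one of the quality-assurance mechanisms listed above, the sketch below shows how an independent verifier might compare the token supply in circulation against attested asset backing. The data structures are invented for illustration; real audit and on-chain verification tooling is considerably richer.

```python
from dataclasses import dataclass

# Illustrative audit check: every token in circulation must map to
# backing that an independent auditor has confirmed in custody.

@dataclass
class Attestation:
    custodian: str
    asset_id: str
    backed_amount: float    # assets the auditor confirmed in custody

def is_fully_backed(token_supply: dict[str, float],
                    attestations: list[Attestation]) -> bool:
    """Supply of each token must not exceed its total attested backing."""
    attested: dict[str, float] = {}
    for a in attestations:
        attested[a.asset_id] = attested.get(a.asset_id, 0.0) + a.backed_amount
    return all(supply <= attested.get(asset_id, 0.0)
               for asset_id, supply in token_supply.items())

# A 1,000-unit token issue passes only if audits confirm >= 1,000 in custody.
supply = {"tokenized-bill": 1_000.0}
audits = [Attestation("custodian-X", "tokenized-bill", 1_000.0)]
assert is_fully_backed(supply, audits)
assert not is_fully_backed({"tokenized-bill": 1_500.0}, audits)  # under-backed
```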
5. Challenges and Future Outlook
Despite the momentum, Maechler acknowledged that the transition is still in its early stages. Deployment of tokenized deposits remains limited, and large-scale adoption faces multiple challenges, including technological interoperability, legal certainty, and regulatory coordination. Additionally, balancing innovation with financial stability and designing platforms that are both open and secure remain unresolved issues.
However, the direction is clear. Tokenization represents a more efficient, transparent, and inclusive financial future. As Maechler emphasized, this is not just technological evolution but a paradigm shift in financial infrastructure. Central banks, commercial banks, tech companies, and regulators must collaborate carefully to ensure that this transformation truly serves the stability and development of the global economy.
Through the forward-looking insights and experimental initiatives of its senior leadership, the BIS provides an authoritative and clear roadmap for understanding tokenization. Tokenization is not a distant sci-fi scenario but an unfolding financial reality. It redefines how assets flow, reconstructs trust mechanisms, and reshapes the boundaries of financial services. For policymakers, financial institutions, and market participants, understanding what tokenization is and how to implement it has become essential preparation for the next era of finance.