A recent International Monetary Fund (IMF) report cautions that tokenization could accelerate market stress by removing the “buffer” created by settlement delays, effectively arguing that faster markets may reduce the window for regulatory intervention. It is a compelling narrative, but one that risks misdiagnosing where risk actually sits in today’s market structure.
Richard Baker, CEO and Founder of Tokenovate, argues that settlement delays are not a safeguard but a legacy constraint, one that can increase counterparty exposure and trap liquidity, particularly in periods of stress.
From this perspective, the real issue is fragmentation, where disconnected systems, manual processes and delayed visibility allow risk to build unchecked.
As the implementation of tokenization accelerates across major market infrastructures in 2026, a more fundamental question emerges: are we managing risk through better systems, or simply relying on slower ones?
Baker comments:
“The IMF’s concerns that tokenization could accelerate market stress place significant emphasis on speed, but this risks overlooking where vulnerabilities actually arise within today’s market structure. In practice, risk tends to accumulate between systems, where fragmented data, manual processes and delayed visibility limit the ability to act with precision.”
With any new innovation, institutions such as the IMF need to be patient and take the time to understand how these breakthroughs will affect the broader ecosystem. Rather than rushing to conclusions, we must be careful to fully understand the long-term implications of any technological advance.
Baker added:
“Settlement cycles were not designed as a mechanism for risk control. They are largely a consequence of operational complexity and, in many cases, extend exposure rather than mitigate it. When trades remain unresolved for several days, counterparty risk builds and liquidity remains unnecessarily constrained.”
Baker continued:
“Applying tokenization to the post-trade lifecycle begins to address these structural weaknesses. Automated workflows and synchronized data allow for continuous risk management, supported by greater transparency and consistency across participants.”
While these capabilities are promising, they do not automatically translate into tangible improvements over current models. More likely, continued refinement will be needed before tokenization can deliver its stated objectives for both DeFi and TradFi participants.
Baker concluded:
“Faster settlement should not be seen as removing safeguards, but as embedding them more effectively within the market itself. The result is a system with lower exposure, clearer visibility and greater resilience, particularly in periods of stress.”
While the full impact of tokenization remains unclear at this early stage, one thing is certain: these digital innovations will fundamentally transform how existing systems work. Regulators and institutions such as the IMF will need to monitor these developments carefully so they can respond in a thoughtful and meaningful manner.