Virtual Liquidity Series: Asset and Equity Tokenization On-Demand Webinar
Morgan Stanley at Work hosted a webinar discussing equity and asset tokenization with three panelists at the forefront of the asset tokenization landscape. Moderator Chris Betz, Executive Director and Sector Lead for Fintech and Digital Assets in Transformation and Corporate Technology at Morgan Stanley, was joined by Amy Caiazza, Securities Regulatory and Complex Transactions Partner at Wilson Sonsini; Christopher Pallotta, CEO and Founder of Templum; and Robert Hershey, Head of Provenance Capital Markets at Figure Technologies Inc., to discuss their views on the current and future landscape of creating liquidity through tokenization.
Private Sector Tokenization: Encouraging Education and Caution
Moderator Chris Betz began the discussion by underlining the current interest in equity tokenization: “About 2 trillion in assets today are tokenized.”
Regulations and an Emerging Industry
While there appears to be a lot of industry excitement about the potential of equity tokenization, this new technology still needs to fit within the scope of securities laws. Amy Caiazza underlined the challenge of applying existing regulations to the new technology. Securities regulations “were really developed for paper and maybe electronics,” she noted. This means that applying them to newer technologies like blockchain will take time.
Caiazza shared a common encounter from her legal practice working with tokenization industry professionals: They say to her, “you know what, I’m going to do these transactions on the blockchain. It’s going to be totally liquid because it’s on the blockchain,” and Caiazza has to remind them that equity transactions are still most likely subject to securities laws.
As a result of this disconnect, Caiazza believes that “it’s probably going to be longer than we would like it to be before we start seeing the seismic shifts in regulation that allow these transactions to be expedited.”
The Potential Global Impact of Tokenization
Due to the nature of the technology, industry professionals may need to consider compliance with an international regulatory framework.
Panelist Bob Hershey noted that “the [blockchain] technology lends itself very easily to be global.” He cautioned, however, that there may be issues with international transactions on the chain complying with local laws and regulations.
Staying on top of global news may be crucial. Caiazza noted that “regulators and government officials are starting to talk about” global regulation of both stablecoins and tokenized securities.
Lack of Standardization With New Technology
In addition to an evolving regulatory framework, tokenization technology may be experiencing a period of growth and change as developers work to find an efficient and effective process. “I think the first challenge for the industry, meaning for blockchain, is to figure out how do you take all these disparate functions and where can you actually digitize it? At what point could you digitize it?” Bob Hershey asked.
Chris Pallotta, whose company works to answer many of these questions, said that he sees the technology growing over a period of time: “I think it's a matter of stepping stones to get there.” He noted that he thinks it’s important the technology does not become too centralized. “There needs to be some sort of check and balance, [a] process in place.” Developers should be allowed to build, Pallotta added, but there needs to be a check to ensure efficiency.
Hershey and Caiazza had differing views on where this balance could come from in the future. Hershey suggested that large companies could avoid the risk of inefficient chain technology by exercising due diligence. Caiazza pointed out that eventually there will likely be more centralization of tokenization to work within existing regulatory frameworks.
Digital tokenization of assets and equity is a new frontier for regulators, technology companies, and the multitude of industries that see this technology as a way to provide liquidity in normally illiquid assets. Moderator Betz said it best when he noted that the key takeaway from the discussion is that tokenization is “a much more nuanced conversation.” The technology is “broad” and “ethereal to a certain extent,” which is why it requires education, so that would-be adopters can get “specific about what the impact is going to be for whatever area that they're dealing with.”