The best tokenization business might be the one that sees the same asset five times
Think about one tokenized office building.
It gets priced once when the tokens are created. It gets priced again if someone wants to borrow against it. It gets priced again when those tokens trade in the secondary market. It gets priced again for investor reporting. If the asset gets sold or liquidated, it gets priced again there too.
That is five separate checkpoints for the same asset, and each one can matter to revenue.
That is why I keep coming back to valuation as one of the most interesting parts of tokenization. A lot of people focus on issuance because that is the headline event. The longer business may sit in the repeat pricing work that comes after. Per the valuation presentation, those recurring checkpoints are issuance, collateral, trading, reporting, and liquidation. If a provider sits in all five, the asset can keep generating paid work long after the token is launched.
The numbers make that easier to see. Traditional real estate valuation can take 2-5 days and cost about $300-$2k per appraisal, per the presentation. Art can take 1-4 weeks. Collectibles can cost about $500-$5k or more depending on complexity. That is workable for occasional deals. It does not fit a market that wants faster lending decisions, more active trading, and regular valuation updates.
The AI side looks a lot different. The same presentation compares traditional valuation at 2-5 days with AI valuation under 1 minute. Cost drops from about $300-$2k to about $5-$50. If those economics hold up in practice, the pricing layer goes from an occasional service to something the market can use constantly. That matters because tokenized markets do not stop moving after day one.
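The per-asset math can be sketched with the presentation's own figures. This is a back-of-envelope illustration, and it assumes each of the five checkpoints triggers exactly one paid valuation, which is a simplification: trading and reporting would recur in practice, which only widens the gap.

```python
# Illustrative per-asset valuation cost across the five lifecycle
# checkpoints named above (issuance, collateral, trading, reporting,
# liquidation). Cost ranges per valuation come from the presentation;
# "one valuation per checkpoint" is my simplifying assumption.

CHECKPOINTS = ["issuance", "collateral", "trading", "reporting", "liquidation"]

# (low, high) cost per single valuation in USD, per the presentation
TRADITIONAL = (300, 2_000)
AI = (5, 50)

def lifecycle_cost(per_valuation, n_events=len(CHECKPOINTS)):
    """Total (low, high) cost if each checkpoint is priced once."""
    low, high = per_valuation
    return (low * n_events, high * n_events)

trad_low, trad_high = lifecycle_cost(TRADITIONAL)
ai_low, ai_high = lifecycle_cost(AI)

print(f"Traditional, 5 checkpoints: ${trad_low:,}-${trad_high:,}")  # $1,500-$10,000
print(f"AI,          5 checkpoints: ${ai_low:,}-${ai_high:,}")      # $25-$250
```

Even under this conservative one-pricing-per-checkpoint assumption, the spread is roughly two orders of magnitude per asset, which is what turns pricing from an occasional service into something the market can run continuously.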
This is also where scale starts to matter. The presentation uses a $10T-$16T market forecast for tokenized real-world assets by 2030. Even if that range ends up too high, the direction is clear. A market that large is going to need far more valuation work than the current manual system can handle. If each asset keeps coming back for fresh pricing across multiple stages, the revenue opportunity shifts toward the firms embedded in that cycle.
That is one reason DVLT stands out to me. The company talks about DataValue and DataScore, which puts valuation and scoring close to the middle of the story. It also operates as a data broker and data monetization business. That matters because repeated valuation gets stronger when the model has more useful data behind it. A company with its own data supply has a better chance of producing pricing the market will keep using.
There is also a market structure angle. On Mar 18, Nasdaq received SEC approval to allow certain securities to trade in tokenized form. On Mar 19, DVLT announced its NYIAX deal, per company releases. That timing is why I think the valuation story fits DVLT better than people assume. The company is talking about pricing and scoring while also moving closer to exchange-linked tokenization infrastructure.
So when I look at tokenization, I do not only ask who can issue the token. I ask who keeps getting called every time that asset needs a fresh value. The company that sees the same asset five times may end up with a better business than the one that only sees it once.
My opinion only. Not financial advice.