Research Methodology and Data Framework
Updated March 2026
Capital Tokenization employs a multi-source research methodology designed for institutional-grade intelligence on capital markets tokenization. Our analytical framework integrates on-chain data, regulatory filings, corporate disclosures, and primary research to deliver accurate, verifiable intelligence. Every data point published on this site traces back to a named source, and we maintain full audit trails for all quantitative claims.
Data Sources
Central Bank and Multilateral Institution Research
Our primary quantitative foundation comes from central bank and multilateral institution research, which represents the highest-credibility source tier for capital markets analysis.
The Bank for International Settlements (BIS) (bis.org) is the single most authoritative source for institutional tokenization analysis. BIS working papers, quarterly reviews, and reports from the BIS-hosted Financial Stability Board provide foundational data on cross-border tokenization adoption, settlement efficiency metrics, and systemic risk analysis. We specifically reference BIS research on the macroeconomic implications of tokenized collateral velocity, DLT settlement finality, and interoperability between tokenized platforms. BIS analysis informs our institutional infrastructure coverage and the theoretical framework underlying our settlement infrastructure status tracker.
The World Economic Forum (WEF) produces annual forecasts and cross-industry analysis of tokenization’s systemic impact. WEF research projects $15-20 billion in annual operational cost savings across the financial industry from tokenization of capital market infrastructure — data we cite in our fixed-income research and private markets analysis. WEF’s stakeholder methodology provides qualitative depth on institutional adoption barriers that purely quantitative sources cannot capture.
The International Organization of Securities Commissions (IOSCO) published its November 2025 report on “Tokenization of Financial Assets” (FR/17/25), which provides authoritative data on JPMorgan Kinexys transaction volumes ($2 trillion+ processed as of March 2025), cross-jurisdictional regulatory harmonisation progress, and securities-law treatment of tokenized instruments. This report is a primary source for our JPMorgan Onyx entity profile and repo tokenization coverage.
Investment Bank and Consulting Research
McKinsey & Company provides the most-cited market sizing projections in institutional tokenization. McKinsey’s forecast of $4-5 trillion in digital securities issuance by 2030, drawn from their Global Banking Practice research, underpins the growth trajectory analysis in our bond tokenization section and serves as the reference benchmark in our capital markets tokenization report. McKinsey’s methodology — combining bottom-up institutional survey data with top-down market structure modelling — provides a transparent basis for scenario analysis.
JPMorgan research on tokenized collateral efficiency and repo market structure informs our Broadridge DLR analysis and Canton Network coverage. JPMorgan’s corporate disclosures, earnings call transcripts, and Kinexys platform reports are systematically monitored and cross-referenced against IOSCO and BIS data.
Goldman Sachs Digital Assets Platform reports and GS DAP transaction data inform our Goldman Sachs entity profile and European Investment Bank digital bond coverage. Goldman’s co-manager disclosures on EIB tokenized note issuances provide verifiable transaction-level data that supplements on-chain records.
Industry Association and Platform Reports
Broadridge Financial Solutions provides the most detailed publicly available data on institutional repo tokenization through its Distributed Ledger Repo (DLR) platform. Broadridge’s press releases and public filings confirm $385 billion in average daily tokenized repo transactions as of October 2025 — making Broadridge DLR the world’s largest institutional tokenized repo platform by volume. We cross-reference Broadridge’s self-reported figures against JPMorgan Kinexys integration announcements, which independently corroborate the scale of on-chain repo settlement.
Broadridge’s industry whitepaper “Next-Gen Markets: The Rise and Reality of Tokenization” (broadridge.com) provides methodology benchmarks for assessing tokenization maturity across market segments, which we apply in our comparative analysis framework.
GFMA (Global Financial Markets Association) reports on DLT’s impact on capital markets provide regulatory-level validation of tokenization adoption metrics across fixed income, equity, and derivatives markets.
EY survey data — specifically their survey finding that 91% of high-net-worth investors plan to allocate to tokenized bonds by 2026 and 83% of institutional investors plan allocations — is cited in our bond tokenization market sizing and private markets projections. EY’s methodology is institutional survey-based, providing demand-side data that complements the supply-side metrics from platform reports.
On-Chain Data Collection
For tokenized asset metrics that settle on public or permissioned blockchains, we maintain direct indexing pipelines.
Ethereum-based instruments — including tokenized U.S. Treasuries from Ondo Finance (USDY, $1.21B+), Franklin Templeton BENJI ($1.01B+), and Hashnote — are tracked through contract-level event logs. We monitor mint, burn, and transfer events to calculate real-time AUM, holder counts, and velocity metrics. Ethereum’s $7.5 billion institutional RWA market share (59% of total tokenized RWA excluding stablecoins) is derived from aggregating ERC-20 token supply across verified institutional contract addresses, cross-referenced with on-chain analytics from Dune Analytics and DefiLlama.
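The event-log aggregation described above can be sketched as follows. This is a minimal illustration, not our production indexer: the event rows are made up for the example, and in practice they would come from indexed ERC-20 Transfer logs. The zero-address mint/burn convention is the standard ERC-20 pattern.

```python
# Sketch of ERC-20 event aggregation: derive supply and holder counts
# from Transfer events. Event data below is illustrative, not real
# contract output.
from collections import defaultdict

ZERO_ADDR = "0x" + "0" * 40  # mints come from, and burns go to, the zero address

def aggregate_supply(events):
    """Compute total token supply and holder count from Transfer events."""
    balances = defaultdict(int)
    supply = 0
    for ev in events:
        frm, to, value = ev["from"], ev["to"], ev["value"]
        if frm == ZERO_ADDR:        # mint: new tokens enter circulation
            supply += value
        else:
            balances[frm] -= value
        if to == ZERO_ADDR:         # burn: tokens leave circulation
            supply -= value
        else:
            balances[to] += value
    holders = sum(1 for b in balances.values() if b > 0)
    return supply, holders

# Illustrative sequence: one mint, one transfer, one partial burn
events = [
    {"from": ZERO_ADDR, "to": "0xA", "value": 1_000},
    {"from": "0xA", "to": "0xB", "value": 400},
    {"from": "0xB", "to": ZERO_ADDR, "value": 100},
]
supply, holders = aggregate_supply(events)
# supply = 900, holders = 2 (0xA holds 600, 0xB holds 300)
```

Scaled across verified institutional contract addresses, the same fold over mint, burn, and transfer events yields the AUM and holder-count metrics cited above.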
BlackRock BUIDL AUM ($2.01B+) is tracked via on-chain treasury data (BUIDL token supply on Ethereum) and BlackRock’s fund reporting. The $20 billion total RWA TVL figure (excluding stablecoins) aggregates position data across the 73 tracked tokenized treasury products with 55,520+ collective holders.
Canton Network activity is tracked through participant disclosures, as the network’s privacy-preserving architecture limits public on-chain observability. We cross-reference Goldman Sachs GS DAP issuance announcements, Broadridge DLR integration updates, and BNY Mellon custody reports to reconstruct Canton Network activity without relying on direct blockchain data access.
Tokenized repo volumes on Broadridge DLR ($385B average daily) are sourced from Broadridge public filings. JPMorgan Kinexys ($2T+ total processed transactions) draws from IOSCO FR/17/25. These two figures are maintained in our repo tokenization volume tracker with quarterly update cycles.
Regulatory Filing Analysis
Securities regulators increasingly publish tokenization-specific guidance, exemption orders, and sandbox reports that provide both qualitative regulatory context and quantitative market data.
We systematically monitor:
- SEC Division of Corporation Finance no-action letters and exemptive orders relating to tokenized securities, blockchain-based transfer agents, and digital custody
- ESMA DLT Pilot Regime register of authorised DLT market infrastructures and admitted DLT financial instruments, updated quarterly
- FCA Digital Securities Sandbox participant disclosures and progress reports
- MAS (Monetary Authority of Singapore) Project Guardian reports and fintech regulatory notices
- FINMA DLT trading facility licences and associated supervisory communications
- EU MiCA Regulation implementation status and asset-referenced token registrations
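The monitoring list above can be represented as a simple source registry. The schema and cadence values here are hypothetical, chosen for illustration rather than drawn from any published configuration:

```python
# Illustrative (hypothetical) registry of monitored regulatory feeds;
# field names and cadences are assumptions for this sketch.
MONITORED_SOURCES = [
    {"regulator": "SEC",   "feed": "no-action letters / exemptive orders",    "cadence": "continuous"},
    {"regulator": "ESMA",  "feed": "DLT Pilot Regime register",               "cadence": "quarterly"},
    {"regulator": "FCA",   "feed": "Digital Securities Sandbox disclosures",  "cadence": "continuous"},
    {"regulator": "MAS",   "feed": "Project Guardian reports",                "cadence": "continuous"},
    {"regulator": "FINMA", "feed": "DLT trading facility licences",           "cadence": "continuous"},
    {"regulator": "EU",    "feed": "MiCA implementation / ART registrations", "cadence": "continuous"},
]

def due_for_review(sources, cadence):
    """Return the regulators whose feeds follow a given review cadence."""
    return [s["regulator"] for s in sources if s["cadence"] == cadence]
```

Structuring the feed list this way makes the review schedule queryable, e.g. pulling only the quarterly-cadence registers at the start of each quarter.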
These filings inform our regulatory compliance coverage and feed directly into our comparison analyses between jurisdictions. The GENIUS Act (signed July 18, 2025) establishing payment stablecoin guardrails and the CLARITY Act defining SEC/CFTC jurisdiction for digital assets are tracked through legislative text analysis and congressional testimony.
Analytical Framework
We analyse capital markets tokenization through four lenses:
1. Market structure impact: How tokenization changes trading, settlement, custody, and collateral management. This includes analysis of the transition from T+2/T+1 settlement cycles to atomic T+0 settlement, collateral velocity improvements (BIS estimates $100 billion+ in capital freed annually through efficient collateral management), and market microstructure effects of 24/7 tokenized secondary markets.
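As a stylized back-of-the-envelope illustration of why settlement-cycle compression frees capital (this toy model is our own simplification, not the BIS methodology): capital locked in unsettled trades scales roughly with daily volume times the length of the settlement window.

```python
# Stylized model: notional capital tied up awaiting settlement under a
# T+n cycle is approximately daily volume * n settlement days.
# This is an illustrative simplification, not the BIS estimation method.
def capital_in_flight(daily_volume_usd, settlement_days):
    """Notional capital awaiting settlement under a T+n cycle."""
    return daily_volume_usd * settlement_days

daily_volume = 385e9  # e.g. Broadridge DLR's reported average daily repo volume
t2 = capital_in_flight(daily_volume, 2)  # T+2 cycle
t0 = capital_in_flight(daily_volume, 0)  # atomic T+0 settlement
freed = t2 - t0
# In this toy model, moving that flow from T+2 to T+0 releases ~$770B
# of in-flight notional.
```

Real-world capital release is far smaller than gross in-flight notional, since netting, margining, and rehypothecation already offset much of it; the sketch only conveys the direction and rough scale of the mechanism.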
2. Economic efficiency: Cost reduction, capital velocity, and collateral optimisation at the institutional level. The WEF’s $15-20 billion annual operational savings estimate, Broadridge’s documented transaction cost reductions, and JPMorgan’s intraday liquidity improvements provide the quantitative basis for efficiency analysis.
3. Regulatory evolution: How securities regulators adapt legal frameworks to accommodate tokenized instruments while maintaining investor protections. We track regulatory developments across 15+ jurisdictions to map the global regulatory landscape for tokenized capital markets.
4. Competitive positioning: Which institutions are gaining strategic advantage through early tokenization deployment. Our entity profiles combine competitive intelligence with transaction data to assess strategic positioning of JPMorgan Onyx, Goldman Sachs GS DAP, HSBC Orion, BNY Mellon, BlackRock BUIDL, Broadridge, Canton Network, and SWIFT.
Verification Standards
All figures undergo verification against at least two independent sources before publication. Where only single-source data exists (common for proprietary platform volumes), we note the sourcing limitation explicitly. We distinguish between self-reported institutional data and independently verified on-chain data throughout our analysis. Self-reported figures are annotated as such when they appear in dashboard trackers and entity profiles.
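The two-source rule above can be sketched as a simple classification check. The helper, its tolerance threshold, and the status labels are illustrative choices for this sketch, not a published standard:

```python
# Sketch of the two-source verification rule: classify a data point by
# how many independent sources support it. Tolerance and labels are
# illustrative, not a published standard.
def verify_figure(values_by_source, rel_tol=0.05):
    """Classify a figure as 'verified', 'disputed', or 'single-source'.

    values_by_source: dict mapping source name -> reported value.
    """
    vals = list(values_by_source.values())
    if len(vals) < 2:
        return "single-source"   # flagged explicitly in published trackers
    lo, hi = min(vals), max(vals)
    if hi == 0:
        return "verified" if lo == 0 else "disputed"
    return "verified" if (hi - lo) / hi <= rel_tol else "disputed"

# A self-reported platform volume corroborated by a second disclosure:
status = verify_figure({"platform filing": 385e9, "partner announcement": 380e9})
# status == "verified" (the two sources agree within 5%)
```

Figures classified as single-source or disputed carry the corresponding annotation wherever they appear in dashboard trackers and entity profiles.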
Historical data is maintained in structured databases with full provenance tracking. Corrections are issued transparently with original figures preserved alongside updated data. Our fixed-income analysis and infrastructure coverage undergo quarterly data audits. The settlement infrastructure status dashboard is updated weekly from primary source data; the repo volume tracker follows the quarterly disclosure cycle of its underlying filings.
For data points derived from BIS, McKinsey, WEF, IOSCO, or Broadridge research, we cite the specific publication title, date, and document number in footnotes or inline citations. For on-chain data, we provide the contract address and indexing methodology on request.
Research Independence
Capital Tokenization maintains editorial independence from all covered institutions. We accept no payment, sponsorship, or preferential access from JPMorgan, Goldman Sachs, BlackRock, BNY Mellon, SWIFT, Broadridge, or any other covered entity. Advertising revenue through Google AdSense supports operations without influencing editorial decisions. We disclose any material conflicts of interest in individual analyses where they arise.
Our methodology is documented publicly precisely because institutional-grade intelligence requires verifiability. Readers should be able to trace any quantitative claim on this site back to its primary source, assess the source’s credibility, and evaluate whether our interpretation of that data is well-founded. Methodology questions and data corrections should be directed to info@capitaltokenization.com.