
Hash Global Founder: BNB Value, Digital Asset Allocation, and Web3 Development Stages

2025/12/12 21:00

Organized by: Hash Global

Recently, KK, founder of Hash Global, and Ms. Peng Jing, global CEO of Noah Olive, exchanged views on the development stages of Web3, the value of the BNB ecosystem, and digital asset allocation at the Noah Black Diamond Customer Annual Meeting. The content is as follows:

Jing: Hash Global has been investing in Web3 since 2018, focusing on both early-stage projects and the secondary market. We were an early node on BNB Chain and also launched a crypto fund targeting institutions and families. You could say we've experienced the industry "full-cycle and full-chain." Looking back from 2018 to today, what in the crypto industry has truly been disproven, and what long-term direction have you become increasingly confident in?

KK: What I think has truly been disproven is over-construction driven by grand visions but detached from actual needs. In Web3, many things were built on the premise of "I think the metaverse will need this, so I'm building it." That kind of infrastructure will be abandoned, like the excess undersea cables laid in the early days of the internet. In recent years the industry has also seen many meaningful attempts, such as SocialFi and GameFi, but none have taken off yet; I see these as only temporarily disproven. I believe there will be a second wave, and perhaps a third. Many things that were initially unworkable or immature will naturally become feasible once the business environment and policies are ready and there are enough genuine users of Web3 products. In my view, the technology is already there; now it's about finding the weaknesses in existing business models and the most fundamental real pain points, and having the right teams build new models around them. The AI industry is the same; development is a spiral. Many things that are temporarily "disproven" were simply done too early. No matter how strong the team, you couldn't have built Yahoo in 1990 or Google in 1996. That is an objective law of development.

Over the past few years, we've witnessed this industry, despite occasional bubbles and misinformation, steadily progressing and making significant strides every year. I often say that the Web3 industry is too close to "money," making entrepreneurs easily distracted, unlike AI entrepreneurs who are more focused. Some former entrepreneurs may have left the industry, but many more are entering. I recently heard from a friend that this year's Wanxiang Shanghai summit was exceptionally well-attended, even more so than the Hong Kong Bitcoin Conference, a feat unseen in years. The Wanxiang Blockchain team and CEO Xiao Feng have consistently driven the industry's upward spiral.

Web3 technology has already demonstrated its immense power in finance and payments, but as the infrastructure for the next generation of the internet it has not yet seen widespread adoption. Some people in the industry are even asking whether Web3 is merely a form of Fintech. We believe that what Web3 means for business is what Bitcoin means for finance: when the underlying ledger technology changes, both finance and commerce change with it. The current Web3 user base is simply too small to create a network effect. Web3 is a network technology and needs a critical mass of users, unlike AI, which can deliver value at a single point. AI is a productive force, explicit and immediately visible, while Web3 is a production-relations technology; its emergence is subtle and gradual. Even our parents may already be using it without realizing it.

Web3 applications now need to revolve around solving business problems. We've already seen projects use Web3 technology to tangibly improve the operational efficiency of internet products. Alongside cost reduction and efficiency gains, some entirely new business models are emerging: for example, linking fans and creators through Web3 technology so that contributions can be verified and profits shared. Because it is commercially viable, we are very confident in the long run; it's just a matter of who gets started first and who scales up first.

Jing: What is BNB's role in the entire Binance ecosystem? From the outside, people know that Binance has an exchange, a blockchain, a Launchpad, and ecosystem projects, but they may not be clear about BNB's actual role within this system. What is BNB's long-term "value anchor"? And where will the ecosystem's focus around BNB shift over the next 3-5 years?

KK: BNB is the ecosystem token issued by Binance and the core value carrier of the BNB ecosystem. If you pay close attention, you'll notice that we generally don't say "the Binance ecosystem"; we say "the BNB ecosystem."

To understand BNB, let me give you an example. If Tesla had not yet listed on Nasdaq, I believe Elon Musk would have issued Tesla tokens on the blockchain. Tesla owners could then use Tesla tokens for discounted charging at any charging station worldwide, and Musk could use the token to bind together the interests of all ecosystem participants: shareholders, users, management, and upstream and downstream partners. The token could also be used within the ecosystems of sister companies like SpaceX. In the Web2 world it is difficult to integrate loyalty points across companies; in Web3, points or tokens are naturally interoperable because they live on a single ledger (the chain). Tesla tokens could even extend beyond Musk's business empire. This is a completely new organizational model, and it does not require complete decentralization to generate significant economic growth and emotional value for users. Once Tesla stock (as data) leaves the Nasdaq and DTC (Depository Trust Company) databases and becomes an on-chain token, businesses worldwide can exchange value with Musk's business system through Tesla tokens, achieving verifiable business trust. By contrast, Tesla stock can at most bind management and shareholders together. That gap is the value of Web3 technology; how could Web3 be merely Fintech? BNB is Binance's ecosystem token, and it has been running for eight years. The model is already there.
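
To make the single-ledger point concrete, here is a minimal illustrative sketch in Python. The company names, token symbols, balances, and exchange rate are all hypothetical; the point is simply that when two ecosystems' points live on one shared ledger, they can be swapped directly, without either company integrating with the other's database.

```python
# Minimal sketch of two hypothetical ecosystem tokens living on one shared ledger.
# All names, balances, and the exchange rate are illustrative assumptions, not real data.

class Ledger:
    def __init__(self):
        self.balances = {}  # balances[token][address] -> amount

    def mint(self, token, address, amount):
        self.balances.setdefault(token, {}).setdefault(address, 0)
        self.balances[token][address] += amount

    def transfer(self, token, sender, receiver, amount):
        book = self.balances[token]
        if book.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        book[sender] -= amount
        book[receiver] = book.get(receiver, 0) + amount

def swap(ledger, user, market_maker, give_token, get_token, amount, rate):
    """Swap one ecosystem token for another at a quoted rate; possible in two
    simple steps only because both tokens sit on the same ledger."""
    ledger.transfer(give_token, user, market_maker, amount)
    ledger.transfer(get_token, market_maker, user, amount * rate)

ledger = Ledger()
ledger.mint("TESLA_PTS", "alice", 100)            # e.g. earned from charging
ledger.mint("SPACEX_PTS", "market_maker", 1_000)  # liquidity for the other token

# Alice converts 40 hypothetical Tesla points into SpaceX points at an assumed 0.5 rate.
swap(ledger, "alice", "market_maker", "TESLA_PTS", "SPACEX_PTS", 40, 0.5)
print(ledger.balances)
```

In a Web2 setup, the same conversion would require a bilateral integration and reconciliation between two separate points databases; on a shared ledger it is just two transfers.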

Hash Global defines BNB, a token backed by both value inputs and various functionalities, as a value-utility token. We anticipate a surge in the number of utility tokens over the next three years. We recently released a more academic report discussing blockchain integration and value-utility tokens. We also had Chairman Wang from Noah Holdings write a commentary on our report, which you can find online. The token IDOL from Meet48, a leading entertainment company we've invested in, is also a value-utility token, and we encourage projects to design their own ecosystem tokens in the style of BNB. For a token to have value, its underlying business model must first and foremost be healthy and sustainable.

BNB's value anchor rests on at least two aspects: 1) the various utilities within its ecosystem; and 2) the value support provided by the BNB Chain and the Binance exchange. The core of the BNB ecosystem is BNB itself. Through eight years of exploration and practice, Binance has effectively concentrated various ecosystem benefits onto BNB. The enhancement of BNB's value—the core engine of the entire ecosystem—helps attract more users, facilitates the issuance and trading of more high-quality assets, and consequently attracts more developers to build upon it. This engine is organically and efficiently bound to all parts of the BNB ecosystem. In our view, the efficiency and power of this engine are unparalleled in the industry.

Binance recently obtained its full ADGM license in Abu Dhabi. The Binance exchange already has over 300 million users globally, while the entire BNB ecosystem, including BNB Chain users, boasts over 500 million users. This is a truly unique, and currently unparalleled, internet-like financial infrastructure. CZ, He Yi, and the Binance management team have, through eight years of hard work, built a massive financial behemoth capable of providing comprehensive services to the emerging AI-powered digital economy. Its emergence will inevitably lead to various clashes with the existing financial landscape and institutions. During this process, the Binance team has also endured considerable pressure, and we have even witnessed some unfair treatment directed at individuals. We should thank the founders and team of Binance; their contribution to the development of Web3 finance and its ecosystem surpasses that of any other platform. Because this path has been arduous, filled with uncertainty and pressure that most people cannot withstand, requiring continuous innovation and sustained high levels of competitiveness, we believe its growth path will be difficult to replicate.

In the future, all assets will be issued and traded on-chain. This isn't my statement; it comes from Paul Atkins, Chairman of the U.S. Securities and Exchange Commission (SEC). Let me add that this applies not only to financial assets, but also to all non-financial assets and to many "assets" that currently cannot even be formalized as assets. Many may not know that tokenized U.S. stocks can already be bought and sold on-chain, and that more than half of that global trading volume takes place on BNB Chain. It is BNB, rather than stablecoins, that has effectively become the primary pricing tool for many digital assets once they go on-chain. And for a new stablecoin to gain liquidity and scale, it needs the support of Binance. Over the next 3-5 years, in this new era of everything on-chain, we believe the BNB ecosystem will be the biggest beneficiary, with all kinds of assets seeking liquidity support from it. BNB is not only the driving force of the BNB ecosystem, but also the engine of the entire Web3 ecosystem and the digital economy.

Jing: How should families and high-net-worth clients "get on board for the first time"? In a global multi-asset portfolio, what is a reasonable position range for Web3/Crypto as a whole?

KK: I think that after an investor has allocated to low-risk, capital-preserving assets such as insurance, gold, bonds, and real estate, they should also allocate a certain proportion to growth assets. Among growth assets, we are most optimistic about AI and Web3. Professor Zeng Ming has said that the upper layer of future intelligent business will be the productivity brought by AI, while the underlying large-scale coordination network can only be Web3. In my view, AI and Web3 are two sides of the same coin in the digital economy. To illustrate: everyone expects an explosion of AI agents, but value exchange between agents cannot run through bank accounts; it can only run through the on-chain addresses that Web3 provides. We just co-hosted a machine-economy forum in Dubai with Nasdaq-listed Robo.AI and the team behind Arkreen, a leading Web3 project in the machine-economy field; value exchange between machines, likewise, can only happen through Web3. So my suggestion is to split growth assets half into AI and half into Web3.

Jing: Looking at 2025, from the perspective of family offices and high-net-worth clients, what three main lines would you use to summarize the main investment opportunities and risks in the next 3-5 years?

KK: 1) First, I think that over the next 3-5 years the biggest risk for traditional investors, which is also an opportunity, may lie in under-allocating to Web3 while over-allocating to AI. Given how strong the consensus around AI already is, and how much noise and how little attention surround Web3 and digital assets, I think investors should consciously increase their allocation to digital assets.

2) Secondly, investors may overemphasize finding Alpha and neglect Beta. In reality, outperforming BTC or BNB within the Web3 industry is extremely difficult, just as outperforming Nvidia or Google within AI is. So investors must secure the Beta first to avoid missing the leading assets. Look at the leading companies of the internet era: they remain leaders in the mobile-internet and even the AI era, and think about how many times over they have grown. By holding those leaders, investors fully capture the dividends of technological development and social progress: Google, Microsoft, Amazon, Alibaba, Tencent, and so on. The same applies to Web3. First identify the leading, core, scarce assets, and once you've secured them, don't let go. These are the most valuable assets; in my personal view, their importance even surpasses that of assets like real estate.

3) The third opportunity and risk is insufficient attention to the Web3 industry. Everyone knows what the AI industry is about: you may be bombarded with research reports daily and know Nvidia's and Microsoft's financials intimately, but is all that operational data really so important? Leave that to Noah's professional team. Instead, spend some of that time learning about Web3. You bought BTC, but did you truly understand it? If you had genuinely studied it, you probably wouldn't be asking me, "Is it a good time to buy BTC at this price?"

Friends who follow the Web3 industry sometimes ask my opinion on Coinbase and Circle stock. I say I don't know them well, but they should be fine. What I actually wonder is why no one ever asks what I think of leading digital assets like BNB. I suggest everyone spend a little time learning about digital assets like BNB. You can find YZi Labs' research report on BNB online, as well as our own report, "Valuation Methods for Value-Utility Tokens." Believe me, you might not grasp the value of BTC, but if you can understand Coinbase's research reports, you can certainly understand an analysis of BNB. For me, the difference between buying Coinbase stock and buying BNB is like the difference between Suning stock and Alibaba equity in 2009: one is a traditional company using internet technology, and the other is a native internet company. They are not on the same level; the native value-creation model of a new technology is far more valuable. Coinbase is good and worth investing in, but if you are bullish on Web3 and digital assets, you should try to choose native Web3 assets.

Last week I spoke with two major investment firms in the Middle East, and the consensus was that BNB is the elephant in the room. Everyone must pay attention to BNB. I've worked in traditional asset management for over ten years, and I believe BNB is the most undervalued digital asset, yet also the easiest for traditional investors to understand and accept. Warren Buffett called Bitcoin a "rotten apple," but if given the opportunity, I'm confident I could persuade him to buy BNB. Don't miss out on this asset in the next decade.

I think BNB is like Moutai stock in 2004. Before 2004, only retail investors who drank Moutai knew how good it was. After 2004, institutions started to seriously study and research Moutai and began building positions. Now, only people in the Web3 industry know how good it is to hold BNB. Outsiders don't feel it and don't realize BNB's core position in the industry. I'm telling you, institutions are starting to take notice. VanEck, a large US asset management company, recently submitted a product application for a BNB ETF in the US. With such a high-growth, high-yield, monopolistic asset, you can invest and just hold it for ten or twenty years. Currently, institutional holdings of BTC are 12%, ETH is 9%, and BNB is only 0.4%. You can look at Moutai's stock price trend after 2004; it increased more than tenfold in four years. Part of the increase was due to its own performance growth, but the larger increase actually came from institutional consensus. Institutional opinions are contagious.
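
The Moutai comparison and the holdings gap can be sanity-checked with back-of-the-envelope arithmetic. The sketch below takes the figures exactly as the speaker quotes them, namely a roughly tenfold rise over four years and institutional holdings of 12%, 9%, and 0.4%; it is not independent verification of those numbers.

```python
# Back-of-the-envelope arithmetic for two figures quoted above.
# Inputs are the speaker's claims, not independently verified data.

# "Increased more than tenfold in four years" -> implied compound annual growth rate.
multiple, years = 10, 4
cagr = multiple ** (1 / years) - 1
print(f"{multiple}x over {years} years ≈ {cagr:.0%} per year")  # ≈ 78% per year

# Quoted institutional holdings as a share of each asset.
institutional_share = {"BTC": 0.12, "ETH": 0.09, "BNB": 0.004}
for asset, share in institutional_share.items():
    print(f"{asset}: {share:.1%} institutional ownership")
```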

Jing: Within Crypto, how should BTC, ETH, BNB, and other assets be stratified and combined?

KK: We've been saying for years that BTC, ETH, and BNB are the three "one of a kind" assets in the digital asset industry; within their respective categories they are unique, and all three have high long-term value, but they are fundamentally different things. In terms of specific proportions, I would suggest 40% BTC, 20% ETH, and 40% BNB. I recommend a higher allocation to BNB mainly because holding it yields better ecosystem benefits, and because the BNB ecosystem is developing much faster and more efficiently. I believe that in the short to medium term BNB can well surpass ETH in value. Ethereum's value will only be fully realized once the Web3 ecosystem has completely unfolded, perhaps in ten years; by then everyone will understand just how much we need a fully decentralized infrastructure. So there is a question of timing, and there is uncertainty. To be clear, I am very bullish on ETH, and we hold ETH as well. But in the business world, whoever provides better services and solves real-world problems better is often more valuable.
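
As a worked example, the sketch below applies the two rules of thumb from this conversation, half of growth assets into Web3 (from the earlier answer) and a 40/20/40 BTC/ETH/BNB split within crypto, to a hypothetical growth bucket. The dollar figure is an assumption for illustration only, not advice.

```python
# Hypothetical worked example of the suggested allocation rules.
# The bucket size is an assumption; the splits are the ones suggested in the text.

growth_assets_usd = 200_000                # growth bucket, after safety assets are set aside
web3_bucket = growth_assets_usd * 0.5      # "half to AI and half to Web3"
crypto_weights = {"BTC": 0.40, "ETH": 0.20, "BNB": 0.40}

assert abs(sum(crypto_weights.values()) - 1.0) < 1e-9
for asset, weight in crypto_weights.items():
    print(f"{asset}: {weight:.0%} of the Web3 bucket -> {web3_bucket * weight:,.0f} USD")
```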

Holding BNB currently yields over 10% in ecosystem returns, with an additional 3-4% of supply burned annually, for a total approaching 15%. This return stems from BNB's core position in the Web3 industry, a position comparable to Nvidia's in AI. Everyone knows there is an Nvidia in the AI industry; I'm telling you there is an Nvidia in the Web3 industry too. With a little time spent learning, you can discover this massive Beta opportunity for the next decade. The most appealing part is that large traditional institutions are only just beginning to understand it, much like the institutions that started looking at Moutai in 2004, even though those fund managers had never drunk Moutai before.
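
One way to read the "nearly 15%" figure is sketched below. The 10% ecosystem yield and the 3-4% burn are the numbers quoted in the text; the compounded reading, which treats the burn as accruing pro rata to remaining holders, is an illustrative assumption rather than the speaker's own calculation.

```python
# Combining the quoted ecosystem yield with the quoted annual supply burn.
# Inputs come from the text; the compounded reading is an illustrative assumption.

ecosystem_yield = 0.10   # "over 10%" in ecosystem returns to holders
annual_burn = 0.035      # midpoint of the quoted 3-4% of supply burned per year

# Simple additive reading, as in the text ("totaling nearly 15%"):
print(f"Additive:   {ecosystem_yield + annual_burn:.1%}")    # 13.5% at the midpoint

# Alternative reading: a burn shrinks supply, so the per-remaining-token effect
# compounds with the yield instead of simply adding to it.
combined = (1 + ecosystem_yield) / (1 - annual_burn) - 1
print(f"Compounded: {combined:.1%}")                          # ≈ 14.0%
```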

