Tiiny AI Pocket Lab brings private, offline AI computing — including 120B-parameter models — to individuals
LAS VEGAS, Jan. 8, 2026 /PRNewswire/ — Tiiny AI has launched the Tiiny AI Pocket Lab, a pocket-size personal AI computer designed to run large-scale artificial intelligence models locally, as concerns over data privacy, rising usage costs, and cloud dependence increasingly shape how individuals and businesses adopt AI.
The product was introduced earlier this week at CES, where it drew attention from international media, industry analysts, and developers evaluating alternatives to cloud-based AI services. While most mainstream AI platforms rely on remote data centers and usage-based pricing, Tiiny AI Pocket Lab enables users to operate AI entirely on-device — without subscriptions, token fees, or continuous internet connectivity.
At CES, Tiiny AI demonstrated the Pocket Lab running large language models with up to 120 billion parameters fully offline, achieving real-world decoding speeds of more than 20 tokens per second. The company positions this performance as sufficient for daily, practical use — rather than as a technical showcase or proof of concept.
As generative AI usage expands, companies and individual users are reassessing trade-offs between convenience and control. Cloud-based AI services offer scale but often require ongoing fees and the transmission of sensitive data to third-party servers. Local AI systems, once considered impractical due to hardware limitations, are now gaining renewed interest as advances in inference efficiency make large models viable on compact devices.
“People are starting to ask where their data goes and how much AI really costs over time,” said Samar, GTM director of Tiiny AI. “We believe personal AI should feel more like owning a computer than renting intelligence by the token.”
During CES, Tiiny AI positioned Pocket Lab as a dedicated personal AI engine designed to work alongside existing laptops and desktops rather than replace them. The device connects via plug-and-play and takes over large-model inference from the host computer, allowing even older machines to access advanced AI capabilities without hardware upgrades.
Alongside the hardware, Tiiny AI also introduced TiinyOS, its on-device software platform. TiinyOS includes a consumer-facing client that allows users to download and run open-source large language models and AI agents with a single click, as well as developer tools for building and deploying local AI workflows without reliance on cloud infrastructure.
Tiiny AI Pocket Lab is scheduled to launch on Kickstarter in February, with a super early-bird price of $1,399. According to the company, the pricing is designed to lower the barrier to local AI adoption rather than position the device as a premium workstation. Pocket Lab includes 80GB of LPDDR5X memory, a configuration whose standalone market value can exceed $900, reflecting current memory pricing and supply conditions.
Industry observers note that devices in this price range have traditionally struggled to support very large models locally. By contrast, Pocket Lab is designed to run large language models with more than 100 billion parameters on-device, placing it among a small group of emerging systems that aim to bring high-capacity AI inference to personal, offline hardware. The growing interest in such local-first designs mirrors broader discussions around data governance, enterprise security, and the long-term economics of AI services.
In December 2025, Tiiny AI received confirmation from Guinness World Records that the Pocket Lab will be certified as “The smallest mini PC capable of running a 100B-parameter large language model locally.”
The company plans to continue engaging developers and early adopters as it expands access to Pocket Lab following its CES debut.
View original content to download multimedia: https://www.prnewswire.com/news-releases/tiiny-ai-launches-pocket-size-personal-ai-computer-drawing-strong-media-attention-at-ces-302656261.html
SOURCE Tiiny AI


