[ComeUp Stars] Use Any Company’s NPU with Just Two Lines of Code – ZETIC.ai

Byline Network

Dec 18, 2024

“Many AI companies face challenges in profitability due to the enormous costs of GPU cloud servers. In this context, On-device AI is emerging as a compelling alternative to cloud-based AI.”

Kim Yeonseok, CEO of ZETIC.ai, emphasized the growing adoption of On-device AI during the “ComeUp Stars” pitching stage at the recent startup festival ComeUp 2024. ZETIC.ai develops universal software that makes it easy for AI companies to implement On-device AI. By using ZETIC.ai’s software framework, companies can utilize Neural Processing Units (NPUs) in smartphones to run AI on-device, significantly reducing costs compared to GPUs. Kim highlighted this as a key competitive edge of ZETIC.ai.

On-device AI refers to technology that enables AI services to run directly on devices, rather than relying on external servers or the cloud. Instead of using external computing resources, ZETIC.ai leverages NPUs built into smartphones. These NPUs are up to 60 times faster than GPUs and consume less power. This allows companies to save on GPU and CPU costs while achieving faster processing speeds—one of the primary reasons why many AI companies are seeking On-device AI solutions.

The rise of On-device AI has been made possible by mobile chip manufacturers such as Qualcomm and Apple, which began integrating NPUs into smartphones in the late 2010s. However, software that universally supports diverse mobile NPUs is still lacking. It is also impractical for individual AI companies to form partnerships with each hardware manufacturer. ZETIC.ai addresses this gap by offering software that supports multiple NPU environments at once. With just a few lines of code, companies can deploy AI services across a range of devices, including those powered by Apple, Qualcomm, and more.

ZETIC.ai has focused on tackling the high infrastructure costs and technical challenges that hinder companies from adopting On-device AI. In an interview with Byline Network on November 16, CEO Kim Yeonseok shared insights about ZETIC.ai’s services, target markets, and future vision.

Q: Please introduce yourself.
I began working in On-device AI during its early days, back when it wasn’t even called that. After the “AlphaGo shock” sparked global interest in AI, I focused on embedded AI in the mobile sector at Qualcomm. This experience led me to establish ZETIC.ai.

Q: On-device AI is a popular term these days. What exactly does it mean?
It refers to bringing AI computation from centralized servers to devices around us. Essentially, it’s the idea of running AI locally on our everyday devices to make it more accessible in our lives.

Q: How is it possible to compute AI without a server?
Three things are needed: powerful hardware to replace GPUs, optimization to make this hardware efficient, and software to run it all. The most powerful devices around us are smartphones, which started integrating NPUs for AI tasks around 2018. Many people are already carrying devices with these capabilities. Our software framework harnesses this potential to enable On-device AI.

Q: What role does your software framework play in enabling On-device AI?
NVIDIA dominates the GPU market, making it straightforward for companies to run AI services on GPUs. But smartphones are a fragmented ecosystem with varying manufacturers and hardware. ZETIC.ai’s software unifies this landscape, allowing AI services to be deployed across diverse devices with ease.

Q: Why is On-device AI gaining attention now?
Operating and maintaining GPU infrastructure is costly, placing a financial burden on AI companies. On-device AI helps cut costs significantly. Moreover, AI services are increasingly accessed via smartphones, making it practical to leverage their computing power. This approach not only saves costs but also enhances user privacy, improves service speed, and ensures functionality in offline environments.

Q: What services does ZETIC.ai provide?
We let companies turn their AI services into On-device AI, with a development workflow similar to building cloud-based AI services. A beta version is currently available.

Q: Is there demand for On-device AI among companies?
Yes, many companies want to reduce their reliance on GPUs due to high cloud costs. On-device AI also helps safeguard user data by eliminating the need to send it to servers, which appeals to AI companies focused on privacy.

Q: What is ZETIC.ai’s core technology?
Our core capability is enabling AI services to utilize NPUs. Since chipmakers like Qualcomm, MediaTek, and Apple each impose different requirements for their NPUs, our software simplifies deployment across these platforms. With just two lines of code, companies can support various NPUs, much like how apps are developed for multiple devices.
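
For illustration only, here is a minimal Kotlin sketch of what such a "two lines of code" integration could look like. The names (OnDeviceModel, HypotheticalNpuRuntime) are invented for this example and are not ZETIC.ai's actual API; the point is simply that model loading and inference collapse into a couple of calls, while the framework handles the hardware-specific NPU backends underneath.

    // Hypothetical sketch, not ZETIC.ai's published API. It assumes a framework
    // that selects the right NPU backend (Qualcomm, MediaTek, Apple, ...) at runtime.

    // Assumed framework surface (names invented for this illustration).
    interface OnDeviceModel {
        fun run(input: FloatArray): FloatArray   // run one inference on the device's NPU
    }

    object HypotheticalNpuRuntime {
        // Loads a packaged model and binds it to whichever NPU the device exposes.
        fun load(modelPath: String): OnDeviceModel = TODO("provided by the framework")
    }

    // The "two lines" an app developer would write:
    fun classify(input: FloatArray): FloatArray {
        val model = HypotheticalNpuRuntime.load("models/classifier.bin")  // 1: load the model
        return model.run(input)                                           // 2: run it on the NPU
    }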

Q: How long did it take to develop this technology?
It took two engineers six months, starting in April of this year. Rather than inventing entirely new technology, we streamlined existing complexities, creating a highway where none existed before. We’ve conducted proof-of-concept projects with clients over the past couple of months.

Q: Can you share customer success stories?
Our customers typically fall into three categories:

  1. Companies that rely on servers due to a lack of knowledge about On-device AI.

  2. Those offering services on specific hardware only (e.g., Apple devices).

  3. Companies running On-device AI but struggling with low performance.

For example, one client improved video processing speeds from 50–60 frames per second to over 180 frames per second using our solution. Another client expanded from Apple-only services to Android as well. Lastly, an Indian healthcare provider drastically reduced server costs and resolved regulatory issues around medical data by adopting our solution.

Q: How much cost reduction can companies expect by switching to NPUs?
If annual GPU costs run into tens of thousands of dollars, using NPUs can reduce that to zero. Our revenue model involves a small licensing fee per user for deployed services, such as $0.10 per app user.
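
As a rough worked example of this pricing model, here is a short Kotlin sketch; only the $0.10 per-user rate comes from the interview, while the user count and GPU spend are assumed figures for illustration.

    fun main() {
        // Only the $0.10 per-user licensing fee is from the interview;
        // the other figures are assumptions chosen purely for illustration.
        val annualGpuCloudSpendUsd = 50_000.0   // assumed current annual GPU cloud bill
        val appUsers = 100_000                  // assumed number of app users
        val licenseFeePerUserUsd = 0.10         // per-user licensing fee cited above

        val annualLicensingUsd = appUsers * licenseFeePerUserUsd
        println("GPU cloud:           USD $annualGpuCloudSpendUsd per year (assumed)")
        println("On-device licensing: USD $annualLicensingUsd per year for $appUsers users")
    }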

Q: Is there strong demand for On-device AI in Korea?
In Korea, many companies still rely on servers, but demand for On-device AI is growing globally as companies look to improve their cost structures in a challenging economy. On-device AI offers a compelling alternative to costly AI infrastructure.

Q: What differentiates ZETIC.ai in the On-device AI market?
Our strength lies in enabling NPU usage, which is up to 60 times faster than CPUs or GPUs. We also support multiple operating systems and hardware platforms seamlessly.

Q: How many customers have you secured?
We’ve been selling our product for two months, with one domestic client and ongoing discussions with international companies.

Q: Which markets are you targeting for global expansion?
While we’re focusing on the U.S., we’ve also received significant interest from India. We plan to actively pursue U.S. clients early next year.

Q: Why focus on mobile rather than expanding into other industries?
I believe the most immediate impact of AI is in smartphones—the devices in everyone’s hands. Limiting AI services to specific operating systems or environments is fundamentally wrong, which is why we developed a framework for universal deployment. We don’t create AI models or venture into other sectors because we’re committed to mobile.

Q: What’s ZETIC.ai’s vision?
Our vision is clear: to bring global On-device AI into everyone’s hands quickly, making AI as close and accessible as a pocket or a palm. We aim to become the standard for AI software worldwide.

Q: Have you secured funding?
We raised seed funding in April and plan to launch our next round early next year.

Q: Congratulations on being named one of the top 10 innovative startups at ComeUp 2024. How do you feel?
We’re thrilled with the recognition. It’s validation of our hard work and gives us confidence that we’re creating something impactful. It’s also a great motivator for future growth.

© 2024 ZETIC.ai All rights reserved.