[People at MARU] Yeonseok Kim, CEO of ZETIC.ai: "We provide a one-stop on-device AI transition solution for AI companies"

Tech42

December 19, 2024

  • Over six years at Qualcomm AI Research developing on-device AI solutions, including an AI-based speech recognition module.

  • Introduced ZETIC.MLange, a solution for seamlessly transitioning GPU cloud-based AI services to on-device AI.

  • Solved NPU compatibility issues, enabling AI services to operate across any OS, processor, or device.

In June, Apple introduced "Apple Intelligence," bringing on-device AI to the iPhone. Earlier in the year, Samsung had already built on-device AI features into its Android-based Galaxy S24.

Until recently, the focus in AI technology revolved around massive investments in large language models (LLMs) and generative AI. However, since this year, criticisms have arisen about the lack of clear revenue models relative to the investment costs, leading to debates over a potential AI bubble. A significant issue has been the high costs associated with building and maintaining GPU (Graphics Processing Unit) infrastructure for AI development.

As an alternative, on-device AI utilizing NPUs (Neural Processing Units, commonly referred to as AI chips) has emerged. On-device AI processes computations locally without sending data to the cloud, eliminating the need for an internet connection, enhancing computational speed, and providing stronger privacy protections. Most importantly, it dramatically reduces costs by eliminating the reliance on servers.

As NPUs continue to be integrated into various devices, such as smartphones and laptops, it is expected that their applications will expand. Reflecting this trend, more companies in the AI industry are now actively developing services based on on-device AI. However, transitioning from traditional GPU-based cloud AI services to on-device AI presents several significant challenges.

One major obstacle is the low compatibility of NPUs, which are foundational to on-device AI. Major NPU developers, including MediaTek, Apple, Qualcomm, and Samsung, employ different frameworks for AI service development. This requires optimizing applications individually for each NPU. Furthermore, the rapidly evolving market has created a shortage of experts capable of addressing these challenges.

Against this backdrop, one startup has been drawing attention with a solution for transitioning mobile AI services to on-device platforms. That startup, founded by former Qualcomm engineer Yeonseok Kim, is none other than ZETIC.ai.

A Startup Founded by an On-Device AI Expert with Over Six Years at Qualcomm

Founded in March 2024, ZETIC.ai launched the beta version of its integrated on-device AI solution, ZETIC.MLange, just four months later in July. ZETIC.MLange is an automated platform that converts existing AI models—such as STT (speech-to-text), TTS (text-to-speech), segmentation, and generative AI—into optimized libraries for on-device AI.

When AI service companies adopt ZETIC.MLange, they can run AI computations directly on users' devices, significantly reducing server costs. It also allows developers to create AI applications compatible with various operating systems and processors, regardless of the device. Even in its beta version, ZETIC.MLange supports Android, iOS, Linux, and NPUs from leading chipmakers like MediaTek, Qualcomm, and Apple. According to ZETIC.ai, companies with pre-existing AI models can deploy fully functional on-device AI applications to any device within 24 hours without additional engineering work.
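To make the hand-off concrete, the sketch below exports an existing PyTorch model to ONNX, the kind of framework-neutral artifact an automated conversion pipeline of this sort would typically start from. The model, shapes, and file name are placeholders chosen for illustration; this is a generic sketch of the workflow described above, not ZETIC.MLange's actual input format or interface.

```python
# Minimal sketch: exporting an existing PyTorch model to a portable ONNX file,
# the kind of framework-neutral artifact an automated on-device conversion
# pipeline could ingest. Generic illustration only; the toy model and shapes
# below are placeholders, not ZETIC.MLange's actual input specification.
import torch
import torch.nn as nn

# Placeholder model standing in for an existing STT/TTS/segmentation model.
model = nn.Sequential(nn.Linear(80, 256), nn.ReLU(), nn.Linear(256, 40))
model.eval()

dummy_input = torch.randn(1, 80)  # example input matching the model's expected shape
torch.onnx.export(
    model,
    dummy_input,
    "speech_model.onnx",      # portable artifact handed to the conversion pipeline
    input_names=["features"],
    output_names=["logits"],
    opset_version=17,
)
```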

The rapid development and release of these solutions were made possible by CEO Yeonseok Kim's expertise, honed during his time at Qualcomm AI Research Lab from 2017 until the company’s founding. At Qualcomm, Kim specialized in NPU and on-device AI development, creating groundbreaking tools like the Qualcomm On-Device AI Toolkit and NPU-specific AI frameworks. He also contributed to the development of speech recognition modules embedded in Qualcomm chips, which are used in AI speakers by companies like Amazon.

Speaking from ZETIC.ai’s new office at MARU180, the startup support platform run by the Asan Nanum Foundation, Kim reflected on his time at Qualcomm with humility: "I was lucky."

“When I joined Qualcomm, there weren’t many people specializing in AI. I started as a software engineer, approaching AI from a software perspective to create frameworks. At the time, on-device AI, then called embedded AI, didn’t attract much attention in an industry focused on large-scale GPU-based training. But I anticipated a market for on-device AI. I thought it would happen next year, but it arrived earlier than expected—largely because of the costs embedded in AI technology. The greatest advantages of on-device AI come from eliminating the need for servers.”

As the market expanded, Kim envisioned overcoming the limitations of developing NPU-specific on-device AI for Qualcomm by creating a universal AI framework applicable across all NPUs. His motivation to found ZETIC.ai was also influenced by the absence of a dominant player in the mobile NPU market, unlike NVIDIA's dominance in the GPU sector.

“In the global mobile AI chip market, MediaTek holds 40%, Qualcomm and Apple each control 20%, and Samsung’s Exynos accounts for about 6%. Despite Qualcomm’s success, the pie was already divided. I realized we needed innovative AI software to break through these market constraints, which led to the creation of ZETIC.ai.”

Revolutionizing the SoC Landscape with an On-Device AI Solution

As previously mentioned, ZETIC.MLange is not only optimized for Qualcomm NPUs commonly found in Android devices but also supports Apple’s A-series and M-series chips, maximizing NPU performance. AI companies simply need to input their models and data into ZETIC.MLange. The platform automatically optimizes and converts the models into software libraries tailored for NPUs in Android and iOS devices. This eliminates the server-based processing that traditionally required GPU cloud support, effectively reducing server costs to zero.
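To give a sense of what that per-NPU optimization involves when done by hand, the sketch below converts a traced PyTorch model to Core ML so the OS can schedule it on Apple's Neural Engine. It covers only one vendor's toolchain and uses the same placeholder model as the earlier sketch; the article's point is that ZETIC.MLange automates this work across vendors, and nothing here reflects its internal implementation.

```python
# Hand-rolled, single-vendor example of the conversion an automated pipeline
# would repeat per NPU: tracing a PyTorch model and converting it to Core ML
# so it can run on Apple's Neural Engine. Placeholder model throughout; this
# is an illustration of the per-target effort, not ZETIC.MLange's internals.
import torch
import torch.nn as nn
import coremltools as ct

model = nn.Sequential(nn.Linear(80, 256), nn.ReLU(), nn.Linear(256, 40)).eval()
example_input = torch.randn(1, 80)
traced = torch.jit.trace(model, example_input)

mlmodel = ct.convert(
    traced,
    inputs=[ct.TensorType(name="features", shape=example_input.shape)],
    compute_units=ct.ComputeUnit.ALL,  # lets Core ML use the Neural Engine when available
    convert_to="mlprogram",
)
mlmodel.save("SpeechModel.mlpackage")  # bundled into the iOS app instead of calling a server
```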

Kim emphasized the advantages of this approach, particularly for privacy-sensitive applications like healthcare:
“Not relying on servers is a significant benefit when developing AI applications that handle sensitive personal data. Traditional applications send entire datasets to company servers for processing and then return the service to users. This process incurs not only high costs but also potential privacy risks. Many companies struggle with managing this data, creating opportunities for breaches. On-device AI eliminates these risks entirely.”
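As a minimal illustration of that point, the sketch below runs inference entirely in-process with ONNX Runtime, so the user's input is processed locally and nothing is sent to a server. The model file and input are placeholders carried over from the earlier export sketch; a production on-device app would use an NPU-specific runtime rather than this generic one.

```python
# Minimal sketch of on-device inference: the model file ships with the app and
# the user's data is processed locally, never sent to a remote server. ONNX
# Runtime is used here as a generic stand-in; an actual deployment would target
# an NPU-specific runtime.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("speech_model.onnx")  # model bundled with the application

user_features = np.random.randn(1, 80).astype(np.float32)  # placeholder for sensitive local data
(logits,) = session.run(None, {"features": user_features})
print(logits.shape)  # result computed locally; nothing was transmitted off-device
```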

With its ability to support all NPUs, ZETIC.MLange has the potential to disrupt the fragmented SoC (System on Chip) ecosystem, currently dominated by proprietary AI frameworks. Kim explained ZETIC.ai’s independent approach:
“When companies enter the on-device AI market, they face challenges from hardware selection. If issues arise with a specific hardware manufacturer, deploying new services on devices using their NPUs becomes complicated. We aim to maintain independence from NPU manufacturers, focusing instead on leveraging existing hardware to deliver AI technology. This is why we are currently concentrating on mobile on-device AI, enabling everyone to integrate AI into the smartphones they already own.”

ZETIC.MLange, even in its beta phase, supports 80% of mobile NPUs worldwide. The goal is to expand support to the remaining 20%, ultimately enabling on-device AI services across all NPU-equipped devices. Looking ahead, Kim shared his vision of a collaborative ecosystem:
“We are building test cases and improving our platform’s completeness ahead of the official launch early next year. Successful AI software tends to attract hardware manufacturers, creating a position where they want to collaborate. By firmly establishing ourselves as a leader in AI software, we can encourage manufacturers to work with us to build a shared ecosystem. In doing so, ZETIC.ai can become the standard for AI services worldwide.”

Kim acknowledged some limitations of on-device AI, such as constrained performance compared to cloud-based servers and increased battery consumption due to local processing. Nonetheless, he argued that the market for on-device AI services remains substantial:
“Generative AI and LLM models have limitations in running on-device. These models function like search engines, requiring cloud servers to deliver accurate and detailed results. However, the scope for on-device AI lies beyond these LLM tasks. Many AI tasks across industries already operate efficiently on-device. It’s simply a matter of task specificity.”

Building on its unique technology and competitive edge, ZETIC.ai is focusing on establishing case studies in Korea before rapidly expanding internationally. The company has already secured contracts with domestic and global clients, with Kim noting, “2025 will be a critical year for us.”

“While I can’t disclose specifics due to NDAs, we’ve signed contracts with several Korean companies and received positive responses from India and the U.S. Some of our clients currently offer services exclusively on iOS and are now exploring Android integration. Next year, we plan to shift much of our business focus to the U.S., as building case studies there is essential for global expansion. Many regions, particularly those lacking server infrastructure, represent untapped opportunities for on-device AI.”

Aiming for AI Democratization

When asked about ZETIC.ai’s long-term goals beyond the official launch and global expansion, Kim outlined a vision for making AI universally accessible:
“Currently, AI is controlled by companies, but our goal is to bring AI directly to everyone. We aim to open a new era of AI. While we are focused on mobile for now, we plan to expand our support to other hardware platforms in the future. Becoming the standard in the mobile industry is just the first step.”

Kim concluded with a smile, “Eventually, we hope to provide AI models that people can use anytime and anywhere, exactly when they need them.”

© 2024 ZETIC.ai All rights reserved.
