LLMWare Announces Collaboration With Intel for Deploying Gen AI on AI PCs
GREENWICH, Conn., October 15, 2024 (Newswire.com) - LLMWare.ai, a pioneer in deploying and fine-tuning Small Language Models (SLMs), announced today a strategic collaboration with Intel to offer select enterprise customers a limited release of Model HQ in private preview. Model HQ is a new streamlined software package that provides a fast and easy on-ramp to working with SLMs locally on Intel AI PCs while providing comprehensive enterprise-ready capabilities.
AI PCs with Intel Core Ultra processors are optimized with powerful integrated GPUs and NPUs that provide the hardware capability to run AI apps on-device. This allows enterprises to deploy lightweight AI apps locally, without exposing sensitive data or copying it to external systems, while benefiting from added security, safety and significant cost savings.
Specifically designed to maximize model performance on AI PCs with Intel Core Ultra processors, Model HQ provides an out-of-the-box kit for running, creating and deploying AI-enabled apps, with an integrated UI/UX and a low-code agent workflow for easy app creation. With built-in Chatbot and Document Search and Analysis features using the latest models, including Microsoft Phi-3, Llama, Yi and Qwen, as well as LLMWare’s specialized function-calling SLIM models designed for multi-step workflows, Model HQ delivers powerful optimization techniques for the AI PC.
The Model HQ app comes ready to use and provides the ability to launch custom workflows directly on the user’s device. Model HQ is also packed with many enterprise-ready security and safety features, such as Model Vault for model security checks and storage, Model Safety Monitor for toxicity and bias screening, a hallucination detector, AI Explainability logs, a Compliance and Auditing Toolkit, privacy filters and much more.
“At LLMWare, we believe that AI PCs with Intel processors will be transformational in ‘lowering the center of gravity’ for AI and enabling distributed deployment of safe and accurate models as productivity tools that can be integrated cost-effectively and privately into enterprise business processes. Business users need a compelling, easy-to-use experience; IT teams need to ensure safety, security and manageability; and enterprise AI developers want the ability to customize and rapidly create ‘lightweight’ AI apps. With Model HQ and Intel’s AI PCs, we aspire to provide a solution that uniquely meets the needs of each of these stakeholders,” said Darren Oberst, Co-Founder and CTO of LLMWare.
“The demand for AI-enabled client applications is booming. We are happy to expand our AI enabling efforts to reach even more corporate application developers by collaborating with LLMWare on packaged tools and models optimized for Intel Core Ultra processors. Corporate developers have distinct needs for simplicity, built-in security, and multi-step AI workflows. We encourage our commercial developer ecosystem to learn more and sign up for the private preview,” said Carla Rodriguez, Intel Vice President and General Manager of Client Software Ecosystem Enabling.
LLMWare’s Model HQ app will also be featured in an upcoming joint webinar with Intel on November 14, 2024. Entitled “Securing AI Workloads on Client: A Financial Analyst Use Case,” the webinar will offer a technical, hands-on view of Intel Core Ultra’s AI capabilities and how they can help keep business workloads secure.
About LLMWare.ai
LLMWare.ai provides an end-to-end solution for running, deploying and creating AI-based applications using Small Language Models for the enterprise. Selected by GitHub as a leading open source technology shaping the future of AI in 2024, LLMWare is a pioneer in deploying and fine-tuning Small Language Models, particularly for use in highly regulated or data-sensitive industries. With 100+ models on Hugging Face and hundreds of thousands of users, LLMWare is a leader in developing cutting-edge AI app deployment solutions. For additional information, including products, blogs and the latest research reports, please visit llmware.ai.
For Media Inquiries, please contact:
Source: LLMWare.ai