The Linux packager’s SUSE AI Early Access Program could interest companies wanting to run generative AI on premises.

SUSE is preparing an “enterprise-grade generative AI platform” that will run any vendor’s large language models (LLMs) on premises or in the cloud, it said Tuesday. But analysts said it is the on-prem scenario where SUSE could make a big difference, given the virtual absence of major vendors positioning for on-prem.

Bill Weinberg, senior partner at consulting firm OpenSourceSense, said the dearth of on-prem AI suites is noteworthy. “I haven’t seen a lot of integrated AI offerings even talking about on-prem. There was an announcement last year by IBM and VMware regarding support for Watson AI on prem, but with the current trajectory of VMware, it’s hard to say how consequential that solution is today,” Weinberg said. “Today’s announcement is strongly positioned to substantiate the company as a supplier to enterprise of open source AI technology, without making specific assumptions of where the enterprise market really stands regarding on-prem AI. They are hedging their bets.”

SUSE plans to “offer enterprises a modular, secure, vendor- and LLM-agnostic gen-AI platform that will dissolve silos and reduce costs associated with enterprise generative AI implementations. These AI solutions, built on SUSE’s industry-leading open source, enterprise-grade SUSE Linux, Rancher Prime Kubernetes management and Rancher NeuVector security offerings, will enable enterprises to control data flows in a secure, private environment, reducing regulatory compliance risk and improving security,” it said in a statement.
Gartner Research VP Tony Iams said that SUSE’s silence on an alternative for VMware was “maybe a miss in our view.” The consequences of Broadcom’s acquisition of VMware are a concern for customers, he said: “The VMware alternative issue is a pressing, real problem.”

Both analysts saw SUSE’s move as a response to many enterprises struggling to craft an effective gen-AI deployment strategy, one that balances cybersecurity, compliance, privacy, scalability, data leakage, shadow AI, accuracy (aka hallucinations) and cost-effectiveness. Once IT management figures out whether to run workloads in the cloud or on premises, it can then explore the question of open-source versus proprietary operating systems.

Cost trade-offs

Regarding where generative AI workloads are run, on premises or in the cloud, “there are some cost considerations. The jury is out on the cost tradeoffs,” Iams said.

For many enterprises, the on-prem vs. cloud debate is more about control than anything else. It is a common problem for CIOs and CISOs to work out precise settings and customizations tailored to their enterprise’s environment, only to find those decisions overwritten by a cloud staffer who changed settings universally for all cloud tenants.

“The universal business model is that the CIO wants throats to choke,” Weinberg said, referring to the ability to control employees and contractors that your team has hired, versus an employee or contractor working for the cloud vendor.

As for the software, Iams said that “open source is not always going to be cheaper than closed source. There is this perception that open source is cheap, but someone has to get all of it to work together.” That is precisely part of SUSE’s argument: that it will deliver a suite of all the elements needed to support gen-AI deployments, with every element tested to work well together.
“SUSE approaches AI with a strong foundation in open source principles, a commitment to delivering security, and a belief that customer options, including privacy by design, is paramount,” the vendor’s statement said. “SUSE AI takes a responsible AI approach by which enterprises are empowered to choose the models and tools they prefer to get the most out of AI in a private, safe and secure environment.”

Weinberg also pointed to another IT fear: what happens when a gen-AI vendor either goes out of business or simply abandons a product line. He compared an open source AI strategy to traditional code escrow, referring to open source as “the escrow of last resort. If the vendor tanks, you have the source code licenses in an amenably open-sourced forum.”