Selector AI debuts ‘network language model’ for netops

News | Network Management Software
By Sean Michael Kerner
Sep 23, 2024 · 5 mins

Selector AI’s network-specific language model is a fine-tuned version of Llama optimized to understand networking concerns.

Credit: Shutterstock

Thanks to the popularity and widespread success of ChatGPT, most IT users are familiar with the concept of a large language model (LLM). But how does an LLM apply to network operations?

Network and infrastructure operations startup Selector AI today announced its Network Language Model (NLM) technology, which builds on top of an LLM to help optimize network operations. The basic idea is to add another layer of intelligence to all the various network logs and monitoring data that an organization has in place.

In addition to the NLM debut, Selector AI is enhancing its digital twin capabilities, enabling network operators to model an accurate virtual representation of a production network. The company is also adding new programmable synthetic sensors to its platform that enable users to generate synthetic network traffic for testing and validation.

“For a long time, network teams have basically relied on all sorts of different monitoring and observability tools in order to understand what’s happening in the network,” Kevin Kamel, vice president of product management at Selector AI, told Network World. “The area that we focus on at Selector is basically applying a combination of machine learning and AI to all of the telemetry that’s been gathered by these disparate tools within the environment.”

Bridging the language gap with NLM

The Selector AI platform is designed to correlate network and infrastructure telemetry data, performing root cause analysis for issues and helping administrators to better understand network operations.

At the heart of Selector AI’s latest release is the integration of a network language model, built upon the foundation of Meta’s popular Llama 3 LLM.

“What we’ve done is basically take that Llama 3 model, and we’ve now trained it with an enormous corpus of networking telemetry and insights, and really the domain experience that we have at the company,” Kamel explained.

This network-specific language model allows Selector’s platform to understand network terminology and concepts with a high degree of precision. Kamel said that the model understands network connectivity and all that it entails, including interface descriptions. Users can ask questions about a specific interface an organization is using, for example.
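Selector has not published the NLM’s training pipeline, but the general recipe Kamel describes, continuing the training of a Llama 3 base model on a domain corpus, can be sketched in a few lines. The example below is illustrative only: it uses Hugging Face Transformers with LoRA adapters, and the telemetry corpus file is hypothetical.

# Illustrative sketch only: domain fine-tuning a Llama 3 base model with LoRA.
# Selector AI has not published its training code; "network_telemetry.jsonl"
# is a hypothetical corpus of networking text (interface descriptions,
# syslog excerpts, incident notes, and so on).
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

base = "meta-llama/Meta-Llama-3-8B"        # gated model; requires access approval
tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token  # Llama defines no pad token by default
model = AutoModelForCausalLM.from_pretrained(base)

# Wrap the base model with low-rank adapters so only a small set of weights trains.
model = get_peft_model(model, LoraConfig(r=16, lora_alpha=32, task_type="CAUSAL_LM"))

data = load_dataset("json", data_files="network_telemetry.jsonl", split="train")
data = data.map(lambda ex: tokenizer(ex["text"], truncation=True, max_length=1024),
                remove_columns=data.column_names)

Trainer(
    model=model,
    args=TrainingArguments(output_dir="nlm-adapter", per_device_train_batch_size=1,
                           gradient_accumulation_steps=8, num_train_epochs=1),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
).train()

The appeal of the adapter approach is that domain specialization does not require retraining all of the base model’s weights, which keeps the compute needed for this kind of fine-tuning manageable.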

The network language model’s capabilities also extend beyond natural language queries and can tackle complex challenges faced by network teams. For example, Kamel said that Selector is working with a large data center provider that receives information on maintenance windows from network peers around the world. The maintenance window notices came in different languages, including Japanese, Hindi and French, and weren’t always easily understood.

“What we’re doing with the NLM is we’re basically able to read all of those maintenance windows in real time and then actually use those in order to suppress alerts and then inform people in advance of when certain links will be down globally,” he said.
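The alert-suppression step downstream of that extraction is straightforward to picture. In the hedged sketch below, the maintenance window is assumed to have already been pulled out of the peer’s notice, whatever language it was written in, and the field names are hypothetical rather than Selector’s own schema.

# Sketch of suppressing alerts that fall inside extracted maintenance windows.
# The extraction itself (reading the peer's notice in Japanese, Hindi, French,
# etc.) is assumed to have already happened; field names are hypothetical.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class MaintenanceWindow:
    circuit_id: str
    start: datetime
    end: datetime

def should_suppress(circuit_id, alert_time, windows):
    """Return True if an alert for this circuit falls inside a known window."""
    return any(w.circuit_id == circuit_id and w.start <= alert_time <= w.end
               for w in windows)

# Example: a window extracted from a notice originally written in Japanese.
windows = [MaintenanceWindow("PEER-TYO-01",
                             datetime(2024, 9, 25, 15, 0, tzinfo=timezone.utc),
                             datetime(2024, 9, 25, 19, 0, tzinfo=timezone.utc))]

alert_time = datetime(2024, 9, 25, 16, 30, tzinfo=timezone.utc)
print(should_suppress("PEER-TYO-01", alert_time, windows))  # True: drop the alert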

Unlocking the power of digital twins

Alongside the network language model, Selector AI has enhanced its digital twin capability, allowing customers to create a virtual representation of their network.

Kamel said that Selector can model an organization’s network using low-level telemetry in real time, building an in-memory model of what routing and traffic look like across the entire network.

This digital twin feature enables a range of use cases, from historical troubleshooting to “what-if” analysis. Kamel noted that the capability can also be used to handle capacity planning more accurately.

“Today, a lot of these network operators will basically lick their finger and stick it in the air and say, where do I need to add capacity?” Kamel said.

He explained that with the NLM and digital twin approach, Selector is able to accurately assess capacity needs. That might include adding new hardware, or re-provisioning existing underutilized assets to the locations where they are needed.
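One way to picture the “what-if” side of this is as graph operations on an in-memory topology model. The toy sketch below, which is not Selector’s implementation, fails a single link in a small hypothetical topology and shows how the best path between two sites changes.

# Toy "what-if" sketch: model the network as an in-memory graph, fail a link,
# and see how the path between two sites changes. Topology and link costs are
# hypothetical; this is not Selector's digital twin implementation.
import networkx as nx

G = nx.Graph()
G.add_edge("nyc", "chi", weight=10)   # IGP-style link costs
G.add_edge("chi", "sfo", weight=10)
G.add_edge("nyc", "dal", weight=15)
G.add_edge("dal", "sfo", weight=15)

print(nx.shortest_path(G, "nyc", "sfo", weight="weight"))  # ['nyc', 'chi', 'sfo']

# What if the nyc-chi link is taken down for maintenance?
G.remove_edge("nyc", "chi")
print(nx.shortest_path(G, "nyc", "sfo", weight="weight"))  # ['nyc', 'dal', 'sfo']

A production digital twin would carry far more state, such as routing tables, traffic volumes and interface utilization, but the capacity question Kamel describes reduces to the same kind of query: re-route demand in the model and see which links run hot.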

Programmable synthetic sensors for end-to-end visibility

Rounding out the updates, Selector AI has introduced a feature called programmable synthetic sensors, which allows customers to generate synthetic network traffic to exercise specific paths and ensure application performance. 

Selector AI can generate synthetic requests and send them to various application endpoints around the world from different sources. This capability enables enterprises to proactively monitor the end-to-end functionality of their distributed applications, providing early warning of performance regressions.
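Selector has not detailed how its sensors are programmed, but the basic mechanism of a synthetic probe is easy to sketch: issue a request against an application endpoint and record whether it succeeded and how long it took. The endpoint URLs below are placeholders.

# Sketch of a simple synthetic probe from one vantage point. Endpoint URLs are
# placeholders; Selector's programmable sensors run from multiple global sources.
import time
import requests

ENDPOINTS = [
    "https://app.example.com/healthz",
    "https://api.example.com/v1/status",
]

def probe(url, timeout=5.0):
    """Issue one synthetic request and return status code and latency in ms."""
    start = time.monotonic()
    try:
        status = requests.get(url, timeout=timeout).status_code
    except requests.RequestException:
        status = None  # unreachable counts as a failed probe
    latency_ms = (time.monotonic() - start) * 1000
    return {"url": url, "status": status, "latency_ms": round(latency_ms, 1)}

for endpoint in ENDPOINTS:
    print(probe(endpoint))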

Looking forward, Kamel said that the plan is to continue to iterate on the Selector AI platform and expand the capabilities of the NLM. The focus is on adding value by helping organizations find and solve problems from the network layer all the way up to the application layer, and everything in between.