
New Relic Integrates its Observability Platform with NVIDIA NIM to Accelerate AI Adoption and ROI

New Relic integrates with NVIDIA NIM inference microservices to help deliver high-performing models optimized for NVIDIA GPUs

New Relic AI monitoring provides in-depth insights across the AI stack for apps built on NVIDIA NIM

New Relic’s platform centralizes data from 60+ AI integrations to provide comprehensive observability

SAN FRANCISCO—June 24, 2024—New Relic, the all-in-one observability platform for every engineer, announced it is integrating its platform with NVIDIA NIM inference microservices to reduce the complexity and costs of developing, deploying, and monitoring generative AI (GenAI) apps. Now, customers can use New Relic AI monitoring to gain comprehensive visibility across the AI stack for applications built with NVIDIA NIM, all with a simplified setup and while ensuring data security. This complements the robust security and ease of use that NVIDIA NIM’s self-hosted models provide, accelerating generative AI application delivery. Together, New Relic and NVIDIA NIM can help customers adopt AI faster and achieve quicker ROI.

Observability is Mission Critical to Deploying Cost-Effective, High-Performance Models 

Organizations are rapidly adopting generative AI to enhance digital experiences, boost productivity, and drive revenue. Gartner predicts that over 80% of enterprises will use GenAI or deploy GenAI apps by 2026. Quick deployment and faster ROI are crucial for organizations to gain a market advantage, and observability is the key: it offers a holistic, real-time view of the AI application stack – across services, infrastructure, and the AI layer – to ensure efficient, reliable, and cost-effective operation.

“In today’s hyper-competitive market, organizations cannot afford to wait years for AI ROI,” said New Relic CEO Ashan Willy. “Observability solves this by providing visibility across the AI stack. We are pioneering AI observability by extending our platform to include AI apps built with NVIDIA NIM. Combining NVIDIA’s AI technology with our expertise in observability and APM gives enterprises a competitive edge in the AI race.”

“As enterprises race to adopt generative AI, NVIDIA NIM can help businesses quickly deploy applications in production,” said NVIDIA Director of AI Software Amanda Saunders. “New Relic’s integration with NVIDIA NIM enables IT and development teams to optimize their AI applications by rapidly observing and responding to operational insights.”

New Relic Delivers Faster ROI for AI Applications Built with NVIDIA NIM

AI applications can complicate tech stacks, increase security concerns, and be cost prohibitive. New Relic AI monitoring provides a comprehensive view of the AI stack, along with key metrics on throughput, latency, and costs, while ensuring data privacy. It also traces request flows across services and models so teams can understand the inner workings of their AI apps. New Relic extends this in-depth monitoring to NVIDIA NIM, supporting a wide range of AI models including Databricks DBRX, Google’s Gemma, Meta’s Llama 3, Microsoft’s Phi-3, Mistral Large and Mixtral 8x22B, and Snowflake’s Arctic. This helps organizations confidently deploy AI applications built with NVIDIA NIM, accelerate time-to-market, and improve ROI.

Key features and use cases for AI monitoring include:

  • Full AI stack visibility: Spot issues faster with a holistic view across apps, NVIDIA GPU-based infrastructure, the AI layer, response quality, token counts, and APM golden signals.
  • Deep trace insights for every response: Fix performance and quality issues such as bias, toxicity, and hallucinations by tracing the entire lifecycle of AI responses.
  • Model inventory: Easily isolate model-related performance, error, and cost issues by tracking key metrics across all NVIDIA NIM inference microservices in one place.
  • Model comparison: Compare the performance of NVIDIA NIM inference microservices running in production in a single view to optimize model choice based on infrastructure and user needs.
  • Deep GPU insights: Analyze critical accelerated computing metrics such as GPU utilization, temperature, and performance states to understand context and resolve problems faster.
  • Enhanced data security: In addition to the security advantages of NVIDIA’s self-hosted models, New Relic lets you exclude sensitive data (PII) from monitoring of your AI requests and responses.
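
To give a concrete picture of how this comes together, the following is a minimal, hypothetical Python sketch (not taken from the announcement) of an application calling a NIM-served model through its OpenAI-compatible endpoint while running under the New Relic agent. The endpoint URL, model identifier, config file, and task name are illustrative assumptions, and AI monitoring is assumed to be enabled in the agent configuration so that LLM calls, token counts, and latency are captured.

```python
# Hypothetical sketch: a NIM-served model called via its OpenAI-compatible API,
# monitored by the New Relic Python agent. Endpoint URL, model name, and file
# names are placeholders; AI monitoring is assumed to be enabled in newrelic.ini
# (content recording can also be disabled there to keep prompt/response text
# and PII out of telemetry).
import newrelic.agent
from openai import OpenAI

newrelic.agent.initialize("newrelic.ini")  # loads license key, app name, and monitoring settings

# NIM exposes an OpenAI-compatible endpoint; a locally hosted container is assumed here.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed-for-local-nim")

@newrelic.agent.background_task(name="nim-smoke-test")
def ask(prompt: str) -> str:
    # With AI monitoring enabled, the agent's OpenAI instrumentation records
    # latency, token counts, and model metadata for this call.
    response = client.chat.completions.create(
        model="meta/llama3-8b-instruct",  # placeholder NIM model identifier
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(ask("Summarize the benefits of observability in one sentence."))
    newrelic.agent.shutdown_agent(timeout=10)  # flush telemetry before the script exits
```

In a long-running web service the agent would typically be attached via newrelic-admin rather than initialized explicitly, but the same LLM calls would be captured.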

New Relic deepens its 60+ AI integration ecosystem with NVIDIA 

This integration follows New Relic’s recent addition to NVIDIA’s AIOps partner ecosystem. Leveraging NVIDIA accelerated computing, New Relic combines observability and AI to streamline IT operations and accelerate innovation through its machine learning capabilities and its generative AI assistant, New Relic AI. New Relic offers the most comprehensive observability solution, with 60+ AI integrations including NVIDIA GPUs and NVIDIA Triton Inference Server software.

New Relic AI monitoring is available as part of its all-in-one observability platform and offered via its usage-based pricing model. Get started by contacting your New Relic account representative or signing up for a free account.

For more information, please visit:
