editor's blog

The Other KBP

We looked today at Netlogic’s NETL7 “knowledge-based processors” (KBPs). But the KBP category contains more than just this family: it also includes another line, called Sahasra.

But the Sahasra products aren’t simply more of what the NETL7 family does. NETL7 deals with packets all the way up to the application layer (layer 7) of the OSI stack. That’s the essence of deep packet inspection (DPI): you’re not just looking at the various enclosing headers; you’re diving all the way in, which is equivalent to moving all the way up the stack to the original application payload.
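To make that concrete, here’s a minimal software sketch of what layer-7 inspection involves once the enclosing headers have been stripped away. The signature patterns are purely illustrative assumptions on my part (they are not NETL7’s rule format), and a real DPI engine compiles many thousands of such patterns into hardware match engines rather than looping over them in Python.

```python
# Illustrative only: toy application-layer signatures, not NETL7's rule set.
import re

SIGNATURES = {
    "http_get":   re.compile(rb"^GET\s+\S+\s+HTTP/1\.[01]"),
    "tls_hello":  re.compile(rb"^\x16\x03[\x00-\x03]"),  # TLS record: handshake
    "sip_invite": re.compile(rb"^INVITE\s+sip:"),
}

def classify_payload(payload: bytes) -> list[str]:
    """Return the names of all signatures that match the application payload."""
    return [name for name, pattern in SIGNATURES.items() if pattern.search(payload)]

# The L2-L4 headers have already been removed; only the original
# application data is examined here.
print(classify_payload(b"GET /index.html HTTP/1.1\r\nHost: example.com\r\n"))
# -> ['http_get']
```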

By contrast, the Sahasra products deal only with layers 2-4; they're fundamentally for packet and flow routing. The Sahasra KBPs speed things up by implementing dedicated tables or databases tailored to a particular layer or protocol. IP routing tables are a simple example of this sort of thing.
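For comparison, here’s a toy longest-prefix-match lookup, the kind of layer-3 table search a routing-oriented KBP accelerates. The routes and next-hop names are made-up examples; the point is only to show what the lookup computes, since the actual devices do this in dedicated TCAM-style hardware at line rate rather than in a software loop like this.

```python
# Toy longest-prefix-match routing lookup; routes and next hops are invented.
import ipaddress

ROUTES = [
    (ipaddress.ip_network("10.0.0.0/8"),  "next-hop-A"),
    (ipaddress.ip_network("10.1.0.0/16"), "next-hop-B"),
    (ipaddress.ip_network("10.1.2.0/24"), "next-hop-C"),
    (ipaddress.ip_network("0.0.0.0/0"),   "default"),
]

def lookup(dst: str) -> str:
    """Return the next hop for the most specific (longest) matching prefix."""
    addr = ipaddress.ip_address(dst)
    best = max((net for net, _ in ROUTES if addr in net),
               key=lambda net: net.prefixlen)
    return dict(ROUTES)[best]

print(lookup("10.1.2.3"))   # -> next-hop-C (the /24 wins over the /16 and /8)
print(lookup("192.0.2.1"))  # -> default
```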

So while the NETL7 KBPs consist of engines processing match patterns, the Sahasra KBPs essentially provide dedicated fast lookups of the kinds of information needed to route packets.

I guess they’re both “processors” that work based on some knowledge – rules in one case, route or other packet information in the other – but, aside from that, they’re different beasts.

And, in a coincidence of timing, they just announced volume production of their IPv6 KBP this week.

