Deci Releases Powerful Open-Source Generative AI Model, DeciCoder, to Redefine Code Generation for Developers

This innovative leap in code generation, powered by a 1B-parameter LLM, sets the bar for unprecedented efficiency and performance

TEL AVIV, Israel, August 16, 2023 — Deci, the deep learning company harnessing AI to build AI, today released DeciCoder, its first generative AI foundation model, built to help users generate programming code. The Large Language Model (LLM), dedicated to code generation with 1 billion parameters and a 2048-token context window, outperforms comparable open-source code models and redefines the standards of efficient code generation.
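For developers who want to try the model, the following is a minimal sketch of how a Hugging Face-hosted code LLM like DeciCoder is typically loaded and queried with the transformers library. The model identifier "Deci/DeciCoder-1b", the use of trust_remote_code, and the example prompt are assumptions made for illustration; the official model card is the authoritative reference.

# Minimal sketch (assumptions: the checkpoint is published on Hugging Face as
# "Deci/DeciCoder-1b" and ships custom model code requiring trust_remote_code=True).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Deci/DeciCoder-1b"  # assumed Hugging Face identifier
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,  # custom architecture code shipped with the checkpoint
).to(device)

prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))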

DeciCoder’s high throughput and low memory footprint enable teams to achieve extensive code generation with low latency and to migrate workloads to more affordable, widely available GPUs such as the NVIDIA A10G, resulting in substantial cost savings. When benchmarked on Hugging Face Inference Endpoints against well-established code LLMs such as SantaCoder, DeciCoder showed a 22% increase in throughput, a significant reduction in memory usage, and a 1.5-2.4 percentage point improvement in accuracy on the HumanEval benchmark. Notably, when combined with Deci’s LLM inference acceleration library, Infery, DeciCoder’s throughput outperforms SantaCoder’s by 350%.
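As a rough illustration of how such a throughput comparison can be reproduced, the sketch below times generated tokens per second for any causal LM hosted on Hugging Face. It is a simplified harness under assumed settings (prompt, token budget, hardware), not the benchmark methodology Deci used, and the SantaCoder repository name "bigcode/santacoder" is likewise an assumption.

# Simplified tokens-per-second harness; not Deci's benchmark setup.
import time
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

def tokens_per_second(model_id, prompt="def quicksort(arr):", new_tokens=256):
    device = "cuda" if torch.cuda.is_available() else "cpu"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.bfloat16, trust_remote_code=True
    ).to(device)
    inputs = tokenizer(prompt, return_tensors="pt").to(device)

    start = time.perf_counter()
    out = model.generate(**inputs, max_new_tokens=new_tokens, min_new_tokens=new_tokens)
    elapsed = time.perf_counter() - start

    generated = out.shape[-1] - inputs["input_ids"].shape[-1]
    return generated / elapsed

# Example comparison (model ids are assumptions; adjust to the actual repos):
# print(tokens_per_second("Deci/DeciCoder-1b"))
# print(tokens_per_second("bigcode/santacoder"))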

“From enabling the fastest and most cost-efficient deployment for enterprises, to now branching into generative AI, we are relentlessly pushing boundaries and empowering developers with the advanced models and tools needed to effectively implement AI-powered applications across industries,” said Yonatan Geifman, PhD, CEO and co-founder of Deci. “Utilizing DeciCoder means fewer operations during inference, which translates to lower computational costs.”

DeciCoder was generated using Deci’s proprietary Automated Neural Architecture Construction (AutoNAC) engine, the most advanced Neural Architecture Search (NAS)-based technology on the market. AutoNAC identifies an architecture that strikes the right balance between accuracy and inference speed, tailored to specific data characteristics, tasks, performance goals, and inference environments. Deci’s AutoNAC has generated some of the world’s most efficient computer vision and NLP models, including YOLO-NAS, DeciBERT, and DeciSeg.

The rollout of DeciCoder is the first in a series of releases planned for Deci’s generative AI offering over the coming weeks. DeciCoder and its pre-trained weights are available under the permissive Apache 2.0 License, granting developers broad usage rights and positioning the model for real-world, commercial applications.

To gain early access to Deci’s Infery and upcoming generative models, please visit https://deci.ai/get-early-access-deci-generative-ai/

About Deci

Deci enables deep learning to live up to its true potential by using AI to build better AI. With the company’s deep learning development platform, AI developers can build, optimize, and deploy faster and more accurate models for any environment, including cloud, edge, and mobile, allowing them to revolutionize industries with innovative products. The platform is powered by Deci’s proprietary Automated Neural Architecture Construction (AutoNAC) technology, which automatically generates and optimizes deep learning model architectures and allows teams to accelerate inference performance, enable new use cases on limited hardware, shorten development cycles, and reduce computing costs. Founded in 2019, Deci’s team of deep learning engineers and scientists is dedicated to eliminating production-related bottlenecks across the AI lifecycle.
