fresh bytes

Netflix is building an artificial brain using Amazon’s cloud


Nothing beats a movie recommendation from a friend who knows your tastes. At least not yet. Netflix wants to change that, aiming to build an online recommendation engine that outperforms even your closest friends.

The online movie and TV outfit once sponsored what it called the Netflix Prize, asking the world’s data scientists to build new algorithms that could better predict what movies and shows you want to see. And though this certainly advanced the state of the art, Netflix is now exploring yet another leap forward. In an effort to further hone its recommendation engine, the company is delving into “deep learning,” a branch of artificial intelligence that seeks to solve particularly hard problems using computer systems that mimic the structure and behavior of the human brain.
via Wired


Image: Hong Li/Getty
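The "computer systems that mimic the structure and behavior of the human brain" mentioned above are neural networks: layers of simple neuron-like units stacked on top of one another. As a purely illustrative sketch (not Netflix's actual system — the embeddings, layer sizes, and scoring function here are invented for the example), a tiny feedforward network could score user-movie pairs like this:

```python
import numpy as np

# Hypothetical toy recommender: score a (user, movie) pair by feeding
# their embedding vectors through one neuron-like hidden layer.
rng = np.random.default_rng(0)

n_users, n_movies, dim = 4, 5, 3
user_emb = rng.normal(size=(n_users, dim))    # one learned vector per user
movie_emb = rng.normal(size=(n_movies, dim))  # one learned vector per movie
W = rng.normal(size=(2 * dim, 8))             # hidden-layer weights
v = rng.normal(size=8)                        # output weights

def predict(user, movie):
    """Score how much `user` might like `movie` (higher = better)."""
    x = np.concatenate([user_emb[user], movie_emb[movie]])
    h = np.maximum(0.0, x @ W)  # ReLU hidden layer: "neurons" fire or stay silent
    return float(h @ v)

# Rank all movies for user 0 by predicted score, best first.
scores = [predict(0, m) for m in range(n_movies)]
ranking = sorted(range(n_movies), key=lambda m: -scores[m])
print(ranking)
```

In a real system the weights would be trained on viewing history rather than drawn at random; the point is only that the whole pipeline, from embeddings to ranked recommendations, is a stack of simple layers.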

