Tag: transformer model
Are We Running Out of Training Data for GenAI?
The advent of generative AI has supercharged the world’s appetite for data, especially high-quality data of known provenance. However, as large language models (LLMs) grow larger, experts are warning that we may be running out. Read more…
d-Matrix Gets Funding to Build SRAM ‘Chiplets’ for AI Inference
Hardware startup d-Matrix says the $44 million it raised in a Series A round today will help it continue development of a novel “chiplet” architecture that uses 6-nanometer chips embedded in SRAM memory modules for accelerating AI inference. Read more…
10 NLP Predictions for 2022
Natural language processing (NLP) has been one of the hottest sectors in AI over the past two years. Will the string of big data breakthroughs continue into 2022? We checked in with industry experts to find out. Read more…
Nvidia Inference Engine Keeps BERT Latency Within a Millisecond
It’s a shame when your data scientists dial in the accuracy on a deep learning model to a very high degree, only to be forced to gut the model for inference because of resource constraints. But that will seldom be the case. Read more…