SK Hynix has begun mass production of the world’s first 12-layer HBM3E memory with a capacity of 36GB, achieved by thinning each DRAM die by 40% so that twelve layers fit within the same package height as earlier 8-layer parts. The memory runs at a per-pin data rate of 9,600 MT/s and is well suited to LLM training and other AI workloads. Nvidia and AMD are expected to adopt the new HBM3E in their advanced hardware, with SK Hynix prioritizing supply to Nvidia for AI applications.
Full Article
Wikipedia is giving AI developers its data to fend off bot scrapers
Wikipedia is discouraging AI developers from scraping its platform by releasing, in collaboration with Kaggle, a structured dataset optimized for AI training. The dataset, available in English and French, contains machine-readable article data intended for modeling and analysis, while omitting references and non-written elements. The initiative aims to ease the server strain caused by automated bots and to improve data access for smaller companies and independent data scientists, with Kaggle expressing excitement about hosting the Wikimedia Foundation’s...
Read more
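For readers who want to experiment with the release, here is a minimal sketch of pulling a Kaggle-hosted dataset into Python and peeking at a few records. The dataset handle and the per-article field names are assumptions for illustration only; check the actual Kaggle listing for the real identifier and schema.

```python
# Minimal sketch: download a Kaggle-hosted dataset and inspect a few records.
# Assumes the `kagglehub` package is installed and Kaggle credentials are configured.
import json
from pathlib import Path

import kagglehub

# Hypothetical handle for the Wikimedia release on Kaggle; verify on kaggle.com.
DATASET_HANDLE = "wikimedia-foundation/wikipedia-structured-contents"

# Download (and cache) the dataset; returns the local directory path.
local_dir = kagglehub.dataset_download(DATASET_HANDLE)

# The release is described as machine-readable article data; assuming JSON Lines files,
# print the titles of the first few articles from the first file found.
for jsonl_path in sorted(Path(local_dir).rglob("*.jsonl"))[:1]:
    with open(jsonl_path, encoding="utf-8") as f:
        for i, line in enumerate(f):
            article = json.loads(line)
            # The "name"/"title" keys are assumed field names, not confirmed schema.
            print(article.get("name") or article.get("title"))
            if i >= 4:
                break
```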