Humans excel at applying learned knowledge to new situations, and compositionality, the ability to break tasks into reusable parts, is key to that skill. Researchers at OIST in Japan developed a brain-inspired AI model that teaches a robot language and physical actions together, allowing it to generalize and follow novel commands. By combining vision, movement, and language, the robot learned to carry out instructions, with visual attention and working memory proving crucial for accurate learning. The study sheds light on how both humans and AI can learn through a combination of language and physical experience, pointing toward more interactive, human-like robots in the future.
Bumble Bee Foods Is Accused of Tolerating Forced Labor in Supply Chain
A lawsuit filed by four Indonesian fishermen accuses Bumble Bee Foods of profiting from forced labor aboard tuna vessels where workers faced abuse, starvation, and denial of medical care. Greenpeace's findings raised concerns about tainted products in U.S. stores, prompting Bumble Bee to remove misleading claims from its marketing materials. The plaintiffs seek unspecified damages under a law that allows survivors of human trafficking to sue companies complicit in forced labor, shedding light...