Image created with gemini-2.5-flash-image and claude-sonnet-4-5. Image prompt: Photorealistic wide shot of six Ionic limestone columns on Mizzou quad at golden hour, their bases transformed into neatly stacked shipping-box shapes in matching limestone texture, topped with classical entablature carved with AMAZON in Roman serif capitals, red brick buildings and clear blue sky behind, a faint delivery drone shadow crossing the grass, warm late-afternoon light with long soft shadows.
Kindle Translate: AI-powered service for multilingual eBooks https://www.aboutamazon.com/news/books-and-authors/amazon-kindle-translate-books-authors
🤖 From this week’s issue: Amazon introduced Chronos-2, a foundation model designed to handle arbitrary forecasting tasks (univariate, multivariate, and covariate-informed) in a zero-shot manner. https://x.com/dl_weekly/status/1985346603108991015
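The announcement is light on usage detail, but the earlier Chronos models ship in the open-source chronos-forecasting package with a documented pipeline; a minimal zero-shot sketch using a Chronos-T5 checkpoint is below. Assume Chronos-2 exposes a similar interface; its exact model id and multivariate/covariate API are not confirmed here.

```python
# Zero-shot univariate forecast with the open-source chronos-forecasting
# package (pip install chronos-forecasting). Uses the documented Chronos-T5
# checkpoint; Chronos-2 presumably ships a similar pipeline, but that is an
# assumption, not a confirmed interface.
import numpy as np
import torch
from chronos import ChronosPipeline

pipeline = ChronosPipeline.from_pretrained(
    "amazon/chronos-t5-small",  # documented checkpoint; swap in a Chronos-2
    device_map="cpu",           # model id once its interface is confirmed
    torch_dtype=torch.bfloat16,
)

# Any 1-D history works; no fine-tuning on this series is needed (zero-shot).
context = torch.tensor([112., 118., 132., 129., 121., 135., 148., 148., 136., 119.])
forecast = pipeline.predict(context, prediction_length=6)
# forecast shape: [num_series, num_samples, prediction_length]

low, median, high = np.quantile(forecast[0].numpy(), [0.1, 0.5, 0.9], axis=0)
print(median)  # point forecast; low/high bound an 80% interval
```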
Bullying is Not Innovation https://www.perplexity.ai/hub/blog/bullying-is-not-innovation
ElevenLabs CEO: Why Voice is the Next AI Interface – YouTube https://www.youtube.com/watch?v=ZqCEHR4wjxg
$38B compute deal: OpenAI is accessing AWS compute comprising hundreds of thousands of Nvidia GB200 and GB300 chips https://x.com/scaling01/status/1985352400631202187
Announcing strategic partnership with AWS, to help scale the compute required for AI that benefits everyone. https://x.com/gdb/status/1985378899648544947
AWS and OpenAI announce multi-year strategic partnership | OpenAI https://openai.com/index/aws-and-openai-partnership/
AWS announces new partnership to power OpenAI’s AI workloads https://www.aboutamazon.com/news/aws/aws-open-ai-workloads-compute-infrastructure
OpenAI strikes $38 billion AI training deal with Amazon | The Verge https://www.theverge.com/news/812443/openai-amazon-38-billion-cloud-computing-ai
Very pleased to be working with Amazon to bring a lot more NVIDIA chips online for OpenAI to keep scaling! https://x.com/sama/status/1985431030430646365
Enabling Trillion-Parameter Models on AWS EFA https://research.perplexity.ai/articles/enabling-trillion-parameter-models-on-aws-efa
Perplexity is the first to develop custom Mixture-of-Experts (MoE) kernels that make trillion-parameter models portable across cloud platforms. Our team has published this work on arXiv as Perplexity’s first research paper. Read more: https://x.com/perplexity_ai/status/1986101355896098836
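For readers unfamiliar with the layer those kernels accelerate, here is a toy, pure-PyTorch reference for top-k MoE routing. It shows the math being computed, not Perplexity’s EFA-aware kernels; expert count, k, and dimensions are illustrative.

```python
# Toy dense reference for a top-k Mixture-of-Experts layer: each token is
# routed to its k highest-scoring experts, and expert outputs are mixed by
# the renormalized router probabilities. Trillion-parameter serving replaces
# this loop with fused dispatch/combine kernels and all-to-all communication
# across GPUs (the part Perplexity's paper targets).
import torch
import torch.nn.functional as F

def moe_forward(x, router_w, experts, k=2):
    # x: [tokens, d_model]; router_w: [d_model, n_experts]
    probs = F.softmax(x @ router_w, dim=-1)
    topk_p, topk_i = probs.topk(k, dim=-1)           # [tokens, k]
    topk_p = topk_p / topk_p.sum(-1, keepdim=True)   # renormalize over chosen experts
    out = torch.zeros_like(x)
    for e, expert in enumerate(experts):             # gather tokens per expert
        for slot in range(k):
            mask = topk_i[:, slot] == e
            if mask.any():
                out[mask] += topk_p[mask, slot, None] * expert(x[mask])
    return out

torch.manual_seed(0)
d, n_exp = 16, 4
experts = [torch.nn.Linear(d, d) for _ in range(n_exp)]
y = moe_forward(torch.randn(8, d), torch.randn(d, n_exp), experts)
print(y.shape)  # torch.Size([8, 16])
```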
New multi-year, strategic partnership with @OpenAI will provide our industry-leading infrastructure for them to run and scale ChatGPT inference, training, and agentic AI workloads. Allows OpenAI to leverage our unusual experience running large-scale AI infrastructure securely… https://x.com/ajassy/status/1985351258333643172
In discussions of AI and jobs, we put too much emphasis on the technology and not enough on the corporate leaders who are actually making decisions about what they want to do with AI. It is a time when CEO vision matters a lot, and you can see a contrast between Amazon and Walmart. https://x.com/emollick/status/1983879487390728248