a phone beats a mainframe computer at a track and field sprinting event --ar 5:3 --style raw

“I’m super excited by the new eval released by Scale AI! They developed an alternative set of 1k GSM8k-like examples that no model has ever seen. Here are the numbers with the alt format (appendix C): GPT-4-turbo: 84.9%; phi-3-mini: 76.3%. Pretty good for a 3.8B model :-).” / X

PixArt Sigma is the first model with complete prompt adherence that can be used locally, and it never ceases to amaze me!! It achieves SD3 level with just 0.6B parameters (less than SD1.5). : r/StableDiffusion

Phi-3 is so good for shitty GPU! : r/LocalLLaMA

“Phi-3 notebook out! Finetune Phi 3 2x faster and use 50% less VRAM than HF+FA2 with @UnslothAI! Had to Mistral-fy it due to sliding window attention, & fixed the 2048/2047 SWA bug. Also unfused Attention & MLP, so there’s some difference in QLoRA loss, but 16-bit is the same.”

LLaVA++ (Phi-3-V) – a Hugging Face Space by MBZUAI

https://huggingface.co/spaces/MBZUAI/Phi-3-V

Heads up! You’ve scrolled to the end of this category. There may have been just one or two links (above), so go back up and double check to be sure you didn’t quickly scroll down past it.

Be Sure To Read This Week’s Main Post:

This week’s executive overview and top links are here:

AI News #31: Week Ending 05/03/2024 with Executive Summary and Top 95 Links

The post you just read is a deep-dive extension of my weekly newsletter, This Week In AI, an executive summary of the top things to know in AI. Each week, I create an accessible overview so laypeople can feel confident they are conversant with the week’s AI developments. I include a curated list of must-click links of the week, offering everyone a hands-on opportunity to explore the most intriguing updates in artificial intelligence across various categories, including robotics, imagery, video, AR/VR, science, ethics, and more. Beyond the overview, I post topic-based deeper dives like this one. If you haven’t read this week’s overview, I recommend starting there.

Credits/Sources

Most of these weekly links come from just a few prolific oversharing sources. Please follow them, as they work hard to find the news each week and they make it a lot easier for me to compile.

For previous issues, please visit the archives!

Thanks for reading!

