AI Update: Cyborg Computing Chips, AI Anchors, 24GB RAM Smartphones For AI
Today's issue is a look at what the future may hold thanks to our newfound interest in AI
Cyborg Computing Chips: The Future of AI or a Sci-Fi Nightmare?
Imagine a world where your laptop forgets your favorite playlist or your smartphone suddenly can't remember the route to your workplace. Sounds like a bad dream, right? Well, it's a reality for our current artificial intelligence (AI) systems, which, unlike humans, suffer from a condition known as "catastrophic forgetting." But what if I told you that a group of scientists in Australia is on the brink of creating synthetic biological intelligence that could solve this problem?
Picture this: human brain cells, grown in a lab, living on a silicon chip, and learning tasks. Sounds like a scene from a sci-fi movie, right? Well, hold on to your popcorn because this is not a drill. This is the latest development in the world of AI and synthetic biology, where researchers are merging the two fields to create what they call "cyborg computing chips."
The team behind this project, led by Associate Professor Adeel Razi from Monash University, has already managed to teach lab-grown brain cells to play Pong. Yes, you read that right - these cells, living on a silicon chip, can play video games! This breakthrough has opened up a new realm of continual machine learning, where these synthetic chip brains can acquire new skills without forgetting old ones.
The best part? These chips can adapt to changes, apply previously learned knowledge to new tasks, and do it all while conserving computing power, memory, and energy. It's like having a tiny, super-efficient brain in your device.
Now, if you're thinking this sounds too good to be true, you're not alone. The idea of merging AI and synthetic biology to create programmable cyborg computing chips sounds like something straight out of a sci-fi novel. But the Australian government's Office of National Intelligence has given the project a thumbs up, backing it with a $600,000 AUD grant.
So, what does this mean for us? According to Razi, the outcomes of this research could have significant implications across multiple fields, including planning, robotics, advanced automation, brain-machine interfaces, and even drug discovery. It could give Australia a significant strategic advantage, he says.
But before you start imagining a future where your laptop becomes your best friend, remember that this technology is still in its early stages. The team is now focusing on growing human brain cells in a lab dish called the DishBrain system to better understand the biological mechanisms that underpin this continual learning ability.
If successful, this could be a game-changer for the world of AI. But for now, let's just marvel at the fact that we live in a world where scientists are growing brain cells on silicon chips and teaching them to play video games. Isn't science amazing?
So, while we wait for these cyborg computing chips to become a reality, maybe it's time to show your current devices some love. After all, they might not be able to play Pong, but they're doing their best with the brains they've got!
AI Anchors: The Future of News or a Frightening Faux-Pas?
If you've ever found yourself nodding off during the evening news, hold onto your remote because things are about to get interesting. India has introduced a new news anchor to its screens, but there's a twist - she's not human. Meet Lisa, the AI-generated news presenter that's causing quite a stir and making human news presenters sweat in their suits.
Imagine this: you're sitting in your favorite armchair, sipping your evening chai, and you switch on the news. The anchor appears on the screen, but there's something a bit... off. She blinks a little too slowly, her movements are a tad robotic, and she speaks in a monotone that would give a metronome a run for its money. Congratulations, you've just met Lisa.
Now, before we all start panicking about a robot uprising, let's break this down. Lisa, the AI news anchor, has been presenting news on Odisha TV, a local station broadcasting in eastern India. At first glance, she could pass for a human presenter. But upon closer inspection, her slow blinking and stuttering motion are definitely a bit unsettling.
Despite the slightly 'uncanny valley' vibes, Lisa's delivery is serviceable. If you're just listening to the news amidst the hum of the office or the chatter in a café, you could easily mistake her for a human. In other words, she's an acceptable match for a 24-hour medium that's often left playing in the background.
But here's the kicker: Lisa isn't alone. AI news readers are popping up in newsrooms across Indonesia, Taiwan, Kuwait, Malaysia, and China. It's like an episode of Black Mirror come to life, except without the dystopian undertones...or so we hope.
The question on everyone's lips is: what does this mean for human news presenters? Will they be replaced by their AI counterparts? Odisha TV's managing director, Jagi Mangat Panda, assures that Lisa will be a "great partner" and will handle repetitive and data analytical jobs, allowing newspeople to focus on new angles and more creative work.
While that sounds promising, it's hard not to wonder if this is the beginning of the end for human news presenters. Will we soon be getting our daily updates from AI anchors? Only time will tell. But for now, let's just hope they don't start programming Lisa to crack dad jokes. Now that would be truly terrifying.
24GB RAM in Smartphones: A Future Norm Thanks to AI
Talk of smartphones with a staggering 24GB of RAM has been circulating for a while. While this may sound excessive by current standards, it's not as far-fetched as it may seem, especially when we consider the role of AI in future devices.
The reason behind this is simple: AI is RAM-hungry. Running an AI model on a smartphone requires a large amount of RAM, because the model is typically loaded into RAM for the duration of the workload rather than executed from storage. RAM's lower latency and higher bandwidth make it ideal for hosting large language models (LLMs).
One such LLM is Vicuna-7B, a model with 7 billion parameters that can be deployed on an Android smartphone via MLC LLM, a universal LLM deployment framework. On an Android phone it uses about 6GB of RAM and runs entirely locally, with no internet connection needed.
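To see why a 7-billion-parameter model lands in the roughly 6GB range, here's a minimal back-of-the-envelope sketch in Python. The quantization levels and the 1.4x overhead factor are my own assumptions for illustration, not figures published by MLC LLM or Qualcomm:

```python
# Rough estimate of how much RAM a quantized LLM needs.
# The 4-bit quantization level and the overhead factor for the KV cache and
# runtime buffers are assumptions for illustration only.

def estimate_llm_ram_gb(
    params_billion: float,
    bits_per_weight: int = 4,      # common on-device quantization level (assumed)
    overhead_factor: float = 1.4,  # assumed headroom for KV cache and runtime buffers
) -> float:
    """Return an approximate RAM footprint in gigabytes."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead_factor / 1e9

if __name__ == "__main__":
    for bits in (16, 8, 4):
        gb = estimate_llm_ram_gb(7, bits_per_weight=bits)
        print(f"7B parameters at {bits}-bit: ~{gb:.1f} GB of RAM")
```

At 4-bit quantization the weights alone come to roughly 3.5GB, so the ~6GB quoted for Vicuna-7B leaves room for the KV cache, the runtime, and everything else the phone is doing. At full 16-bit precision the same model would blow well past what today's phones can spare, which is exactly why on-device AI pushes RAM requirements up.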
Companies like Qualcomm have been focusing on deploying these AI models in what they call "hybrid AI." This approach combines cloud and device resources, splitting computation wherever it's most appropriate and cutting cloud inference costs in the process.
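Conceptually, the split can be as simple as a routing function: small, latency-sensitive requests stay on the device, heavier ones go to the cloud. The sketch below illustrates that idea only; the helper functions and the threshold are hypothetical, not Qualcomm's actual implementation:

```python
# A minimal sketch of the "hybrid AI" idea: route each request either to an
# on-device model or to a cloud endpoint depending on the workload.
# run_on_device, run_in_cloud, and the threshold are hypothetical placeholders.

def run_on_device(prompt: str) -> str:
    # Placeholder for a local, quantized model (e.g. a Vicuna-7B-class model).
    return f"[on-device answer to: {prompt[:30]}...]"

def run_in_cloud(prompt: str) -> str:
    # Placeholder for a call to a larger hosted model.
    return f"[cloud answer to: {prompt[:30]}...]"

def hybrid_generate(prompt: str, max_local_words: int = 512) -> str:
    """Keep short prompts on the device; send long or complex ones to the
    cloud, where the bigger model (and the bigger bill) lives."""
    if len(prompt.split()) <= max_local_words:
        return run_on_device(prompt)
    return run_in_cloud(prompt)

if __name__ == "__main__":
    print(hybrid_generate("Summarize today's AI news in one sentence."))
```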
Qualcomm is also working on enabling on-device AI to get around the cost issues currently faced by companies. In the case of smartphones, Qualcomm demonstrated Stable Diffusion on an Android smartphone powered by the Snapdragon 8 Gen 2, which is capable of handling intense AI workloads.
Rumors have been swirling around the forthcoming OnePlus 12, which is said to pack up to 16GB of RAM. However, this doesn't rule out on-device AI. Theoretically, OnePlus could provide 16GB of RAM for general usage plus an additional 8GB reserved solely for AI. In that case, the advertised RAM figure wouldn't include the AI-specific portion, since it would be walled off from the rest of the system.
While 24GB of RAM may seem like a crazy amount in a smartphone, it's a clear indication that smartphones are becoming super powerful computers, and with the introduction of features like on-device AI, they can only become more powerful.
Sample Videos From RunwayML’s Latest Release
Less than 99 hours since Runway dropped their Image-to-video update...
The videos that people are creating just from an image are ... bananas!
Here are 11 of my favorite examples: twitter.com/i/web/status/1…
— Borriss (@_Borriss_)
2:22 PM • Jul 25, 2023
How did we do with this issue? I would really appreciate your feedback to make this newsletter better.
That’s all for today.