Revolutionary AI Breakthrough: Apple’s Next-Gen Innovations Set to Transform Your iPhone Experience - Insights From the Latest Research
Apple is taking a deep dive into artificial intelligence technology, according to two recently published research papers showcasing the company’s work. The research shows Apple is working to develop on-device AI tech, including a groundbreaking method to create animatable avatars and a novel way to run large language models from an iPhone or iPad.
Aptly named “LLM in a Flash,” Apple’s research into efficiently running LLMs on devices with limited memory could enable complex AI applications to run smoothly on iPhones or iPads. It could also pave the way for an on-device, generative AI-powered Siri that simultaneously assists with various tasks, generates text, and processes natural language more capably.
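The core idea the paper describes is keeping model weights on flash storage and pulling only the parameters a given step actually needs into DRAM. This is a minimal, hypothetical sketch of that pattern using a memory-mapped weight file and a sparse activation mask; it is not Apple’s implementation, and all names here are illustrative:

```python
import os
import tempfile
import numpy as np

# Hypothetical setup: a weight matrix stored on "flash" (disk) that is
# too large to keep fully in DRAM. We memory-map it and copy in only
# the rows that the current sparse activation pattern requires.
rows, cols = 10_000, 512
path = os.path.join(tempfile.mkdtemp(), "weights.bin")
rng = np.random.default_rng(0)
rng.standard_normal((rows, cols)).astype(np.float32).tofile(path)

# Memory-mapping reads nothing into DRAM yet; pages load on access.
weights = np.memmap(path, dtype=np.float32, mode="r", shape=(rows, cols))

def sparse_matvec(x, active_rows):
    """Compute outputs only for 'active' neurons, transferring just
    those weight rows from flash into DRAM."""
    chunk = np.asarray(weights[active_rows])  # flash -> DRAM copy
    return chunk @ x

x = np.ones(cols, dtype=np.float32)
active = np.arange(0, rows, 100)  # pretend ~1% of neurons fire
y = sparse_matvec(x, active)
print(y.shape)  # (100,)
```

Only about 1% of the weight matrix ever touches DRAM in this toy run, which is the memory saving that makes on-device inference plausible for models larger than available RAM.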
HUGS stands for Human Gaussian Splats, a neural rendering framework that creates fully animatable avatars from short monocular video clips captured on an iPhone. Training on as little as a few seconds of footage, it produces in roughly 30 minutes a detailed avatar that users can animate however they’d like.
What this means for the iPhone and Vision Pro
There have been reports that Apple is working on its own AI chatbot, used internally and called ‘Apple GPT.’ The new research shows the company is making strides toward running LLMs on smaller, less powerful devices like the iPhone by leveraging flash memory. This could make sophisticated generative AI tools available on-device and could mean a generative AI-powered Siri.
Beyond Siri’s much-needed improvement, having an efficient LLM inference strategy like the one described in LLM in a Flash could lead to more accessible generative AI tools, significant advancements in mobile technology, and improved performance in a wide range of applications on everyday devices.
Arguably the bigger advancement of the two, HUGS is a method that can create malleable digital avatars from just a few seconds of monocular video (50 to 100 frames, to be exact). These human avatars can be animated and placed in different scenes, as the framework uses a disentangled representation of humans and scenes.
HUGS lets users create avatars of themselves that can be animated and placed in a scene. This is Apple’s example of three avatars animated in sync. (Image: Apple)
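The "disentangled representation" mentioned above means the scene and the human are modeled as separate sets of Gaussians, so the person can be re-posed or moved without touching the background. This toy 2D sketch illustrates that separation; it is a simplified illustration of the splatting idea, not the HUGS pipeline itself:

```python
import numpy as np

def splat(canvas, centers, sigmas, weights):
    """Accumulate isotropic 2D Gaussians onto a pixel grid --
    a toy stand-in for Gaussian-splat rendering."""
    h, w = canvas.shape
    ys, xs = np.mgrid[0:h, 0:w]
    for (cy, cx), s, a in zip(centers, sigmas, weights):
        canvas += a * np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2 * s * s))
    return canvas

# Disentangled representation: scene and human are separate Gaussian sets.
scene_centers = np.array([[16.0, 16.0], [48.0, 48.0]])
human_centers = np.array([[32.0, 20.0], [36.0, 20.0]])

canvas = np.zeros((64, 64))
splat(canvas, scene_centers, [6.0, 6.0], [1.0, 1.0])

# "Animating" the human just shifts its Gaussians; the scene is untouched.
pose_offset = np.array([0.0, 20.0])
splat(canvas, human_centers + pose_offset, [2.0, 2.0], [1.0, 1.0])
print(canvas.shape)  # (64, 64)
```

Because the two Gaussian sets never mix, a new pose only updates the human’s parameters, which is part of why this family of methods can render far faster than approaches that re-evaluate a whole entangled scene model.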
According to Apple, HUGS outperforms competitors at animating human avatars with rendering speeds 100 times faster than previous methods and with a significantly shorter training time of only 30 minutes.
Creating an avatar by leveraging the iPhone’s camera and processing power could deliver a new level of personalization and realism for iPhone users in social media, gaming, educational, and augmented reality (AR) applications.
HUGS could seriously reduce the creep factor of the Apple Vision Pro’s Digital Persona, showcased during the company’s Worldwide Developers Conference (WWDC) last June. Vision Pro users could wield the power of HUGS to create a highly realistic avatar that moves fluidly, rendered at 60 frames per second.
The speed of HUGS would also allow for real-time rendering, which can be crucial for a smooth AR experience and could enhance social, gaming, and professional applications with realistic, user-controlled avatars.
Apple tends to shy away from buzzwords like ‘AI’ to describe its product features, preferring to talk about machine learning instead. However, these research papers suggest a deeper involvement in new AI tech. Still, Apple hasn’t publicly acknowledged implementing generative AI into its products and has yet to officially confirm its work on Apple GPT.
- Title: Revolutionizing Cycling Experience: How Photochromic Lenses Transform Rides - Insights From ZDNet
- Author: Larry
- Created at : 2024-10-30 12:48:17
- Updated at : 2024-11-01 19:24:52
- Link: https://tech-hub.techidaily.com/revolutionizing-cycling-experience-how-photochromic-lenses-transform-rides-insights-from-zdnet/
- License: This work is licensed under CC BY-NC-SA 4.0.