Apple's Senior Vice President (SVP) of Software Engineering, Craig Federighi, announced that the tech giant is now giving third-party app developers access to its on-device foundation model. These are the same AI models that power many Apple Intelligence features. Through the Foundation Models framework, developers can use the models to build new features inside their apps or to create entirely new apps.
Apple emphasized that these are on-device models, so the AI capabilities work even without an internet connection, and user data never leaves the device. Developers also do not have to pay any application programming interface (API) costs for cloud endpoints. The framework has native support for Swift, letting developers access the AI model easily, and it supports additional capabilities such as guided generation and tool calling.
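To illustrate, a minimal sketch of what Swift access could look like, based on the session and guided-generation API names Apple showed publicly (`LanguageModelSession`, the `@Generable` and `@Guide` macros); the `TripSuggestion` type is an invented example, and exact names and signatures may differ in the shipping SDK:

```swift
import FoundationModels

// A @Generable type lets the framework "guide" the model's output
// into a typed Swift value instead of free-form text.
// TripSuggestion is a hypothetical type for illustration only.
@Generable
struct TripSuggestion {
    @Guide(description: "A short, catchy title for the trip")
    var title: String
    var activities: [String]
}

func suggestTrip() async throws {
    // The session talks to the on-device foundation model, so no
    // network request (and no cloud API cost) is involved.
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Suggest a one-day trip in Kyoto",
        generating: TripSuggestion.self
    )
    print(response.content.title)
    print(response.content.activities)
}
```

Because the model runs on device, a call like this would work offline, which is the trade-off Apple is highlighting against cloud-hosted model APIs.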
New Apple Intelligence features
Federighi revealed that Siri will not get the advanced AI features teased at last year's WWDC until 2026, when Apple will share more information. However, the Cupertino-based tech giant is planning to ship several more Apple Intelligence features this year.
Live Translation
The AI-based Live Translation feature is being integrated into the Messages, FaceTime and Phone apps, so users can easily communicate with people who speak a different language. It runs entirely on device, so conversations never leave the user's device. In Messages, Live Translation will translate messages automatically. On FaceTime calls, the feature will show live captions in the user's language. During phone calls, it will translate what the other person is saying in real time.
Visual Intelligence
Apple is also updating Visual Intelligence. iPhone users can now ask ChatGPT questions about whatever their device's camera is looking at. OpenAI's chatbot will recognize what users are viewing and use that context to answer their questions. It can also search apps such as Google and Etsy to find similar images and products. Users can highlight a product in their camera view and search for it online.
Workout Buddy
AI features are also coming to the Apple Watch. The new Workout Buddy feature uses a user's workout data and fitness history to deliver personalized, motivational insights during a workout. It draws on data such as heart rate, pace, distance and personal fitness milestones, and the company's new text-to-speech model converts this information into voice output. Workout Buddy will be available on Apple Watch with Bluetooth headphones, and it also requires an Apple Intelligence-capable iPhone nearby. The feature will first be available in English for selected workouts such as outdoor and indoor running and walking, outdoor cycling, high-intensity interval training, and functional and traditional strength training.
Genmoji and Image Playground
Genmoji and Image Playground are also being updated this year. In Genmoji, users will now be able to mix emoji together and add text prompts to create new variations. Using Genmoji and Image Playground, users will also be able to change attributes such as hairstyles and expressions when making an image of family and friends. Image Playground is also being integrated with ChatGPT to offer new image styles.
Shortcuts
Apple is also integrating Apple Intelligence into its Shortcuts app. Users will be able to perform tasks such as summarizing text with Writing Tools or generating images with Image Playground. They will be able to use both on-device and Private Cloud Compute models to generate responses that feed into the rest of their shortcuts.