• I tried iPhone 16s Visual Intelligence and now I understand why A

    From TechnologyDaily@1337:1/100 to All on Thursday, October 24, 2024 14:15:05
    I tried iPhone 16's Visual Intelligence and now I understand why Apple added Camera Control

    Date:
    Thu, 24 Oct 2024 12:48:46 +0000

    Description:
    I tried Apple Intelligence's Visual Intelligence on the iPhone 16 Pro Max, and Camera Control now makes sense.

    FULL STORY ======================================================================

    I have been waiting to try Visual Intelligence since Apple first unveiled the iPhone 16 in September. After all, there's a new button (Camera Control) on my iPhone, and I haven't been using it for photography.

    Camera Control hasn't clicked with me after a month with the iPhone 16 Pro Max. I take a decent number of photos with my iPhone, but any time I try to use the dedicated button, it feels cumbersome and confusing, so much so that I've gone back to my trusty touchscreen and Lock Screen shortcut.

    Enter the iOS 18.2 Developer Beta, and I must emphasize: Developer Beta. Please do not try this on your primary device, as it's very much still in development and not ready for the public. Anything I write about in this article relates to the feature itself, not its performance. Furthermore, if you want to try Visual Intelligence for yourself, I recommend waiting until the official release of iOS 18.2 later this year.

    I digress; back to Visual Intelligence. iOS 18.2 is now installed on my iPhone 16 Pro Max, and so far, I'm excited by the prospect of what Visual Intelligence can become.

    What is Visual Intelligence? I hear you ask. Well, it's an Apple Intelligence feature exclusive to the iPhone 16 lineup, and it takes full advantage of Camera Control. You launch it by long-pressing Camera Control and then snapping a photo of whatever you're looking at. From there, you can ask ChatGPT for information, search Google, or highlight any text in the photo. Think of Visual Intelligence as Apple's version of Google Lens, with its own hardware button to access it on the fly.

    My first impressions of Visual Intelligence

    You can launch Visual Intelligence from anywhere, even the Lock Screen, which makes it incredibly useful whenever you want to do a quick search. My first test was taking a picture of my Game Boy Camera on my desk. As mentioned above, Visual Intelligence gives you a few options, so I first used Google Search to find the product. Then, I asked ChatGPT for information, and it was able to tell me all about the Game Boy Camera's history. From there, you can ask follow-up questions, so I asked, "When did the Game Boy Camera launch in Europe?" ChatGPT obliged with the correct answer.

    While it's still in development, Visual Intelligence worked a treat with a recognizable product like the Game Boy Camera. I'm not sure how often I'd use it to search for an item, but considering it's just a simple long press away, it might become my go-to way of searching the web for things.

    Another great use for Visual Intelligence is when you're out and about and want to see information about a shop, cafe, bar, or restaurant. I tested it with a local coffee shop, and while it didn't work quite like Apple showed off in its demo, I think that's more down to the early beta version I'm testing than the feature itself.

    In that demo, Apple showed that Visual Intelligence could determine a dog breed. I tried this with my French Bulldog, and while I could search Google for similar dogs, it couldn't give me a straight-up answer.

    That pretty much sums up Visual Intelligence in its current form. It has huge potential: I love the way it gives Camera Control a genuine purpose, and when it works, it's fantastic. But it's in very early development, and, as expected, there's a lot that needs ironing out.

    One thing is for sure, however: Visual Intelligence makes total sense to me now, and I finally understand why Apple added Camera Control to the new iPhones. It's the kind of Apple Intelligence feature that I can see people turning to when they need a quick answer, as long as it works smoothly, and the ChatGPT and Google integration makes it multi-faceted.

    I love testing new iOS features; every year, my iPhone's lifespan is mostly spent in a beta state, and the iOS 18.2 developer beta feels like the most exciting one yet. After just a few hours with the software, and without access to Genmoji or Image Playground yet (I'm on the waitlist), I can still confidently say that iOS 18.2 feels like the iOS 18 and Apple Intelligence we were waiting for.

    I've just had a glimpse of what Visual Intelligence has to offer, and I'm incredibly excited to see the finished product later this year. Exclusive to the best iPhones, this could be the reason to buy an iPhone 16 - who would've thought it could be Camera Control?

    You might also like...
    I can't wait for this underrated Apple Intelligence feature
    I installed the Apple Intelligence public beta
    My parents' iPhone is not getting Apple Intelligence and they're not happy about it



    ======================================================================
    Link to news story: https://www.techradar.com/phones/iphone/i-tried-iphone-16s-visual-intelligence-and-now-i-understand-why-apple-added-camera-control


    --- Mystic BBS v1.12 A47 (Linux/64)
    * Origin: tqwNet Technology News (1337:1/100)