It’s been a while since Apple Maps last embarrassed the company; these days, Apple’s biggest misstep is putting the Safari URL bar on the wrong part of the screen, according to a report from The Verge.
And over the past two years, the company’s software announcements at WWDC have been almost exclusively iterative and additive. Last year’s iOS announcements, for example, amounted to some quality improvements for FaceTime and some new kinds of IDs you can carry in Apple Wallet. Beyond that, Apple mostly introduced new settings menus: new controls for notifications, Focus mode settings, privacy tools, that kind of thing.
Apple is also a fast follower in software, remarkably quick to adopt and refine other companies’ new ideas. Apple devices are as feature-packed, long-lasting, stable, and usable as anything you’ll find anywhere.
Many companies try to reinvent everything all the time for no reason and end up creating problems where none existed. Apple is nothing if not a relentlessly efficient machine, and that machine works hard to polish every pixel it puts on its hardware.
But we’re at a turning point in technology that will demand more from Apple. It’s becoming increasingly clear that AR and VR are the next big thing for Apple, the next supposedly industry-shaking platform after the smartphone.
Apple probably won’t show a headset at WWDC, but as virtual reality and augmented reality enter our lives, everything about how we experience and interact with technology will need to change.
Apple has been showing off augmented reality for years, but all it has shown are demonstrations: things you can see or do through the camera on the other side of the screen.
According to the report, little has emerged about how the company thinks augmented reality devices will work or how we will use them. For devices that matter, the company will need new hardware and a new software model to match. That is what we may see at this year’s WWDC.
Last year, Apple announced Live Text, the ability to point an iPhone at a piece of paper and have it automatically scan and recognize any text on the page: an AR-adjacent feature that uses your phone’s camera and artificial intelligence to capture, understand, and categorize information from the real world.
The whole tech industry thinks this is the future. It’s what Google is doing with Maps and Lens, and what Snapchat is doing with its Lenses and filters. Apple needs a lot more of wherever Live Text came from.
From a pure user-interface perspective, the one thing AR demands is a more efficient system for getting information and getting things done. No one will wear augmented reality glasses that blast them with Apple Music ads and news alerts every six minutes, and full-screen applications that demand your undivided attention will increasingly become a thing of the past.
The phrase “use your phone without getting lost in your phone” looks to be a theme at this year’s WWDC. According to Bloomberg’s Mark Gurman, we may see an iOS lock screen that displays useful information without your having to unlock your phone.
A more glanceable iPhone seems like a great idea, and a great way to stop people from unlocking their phones to check the weather only to find themselves deep in a TikTok hole three and a half hours later. The same goes for interactive widgets, which rumor has it will let you do basic tasks without having to open an app. And if Focus mode gets its rumored improvements, and especially if Apple can make Focus mode easier to set up and use, it could be a very useful tool on your phone and an absolutely essential one on AR glasses.
AR will demand apps that offer more but also get out of the way more
Apple is expected to keep bringing its devices closer together in what they do and how they do it, in an effort to make its whole ecosystem more usable. With nearly the entire Mac and iPad lineup running on Apple’s M-series chips, and perhaps the whole lineup once the long-awaited Mac Pro arrives, there’s no reason the devices can’t share DNA.
Universal Control, which was probably the most exciting announcement of iOS 15 even though it didn’t ship until February, is a great example of how Apple has come to treat its many screens as parts of one ecosystem.
If iOS 16 brings true multitasking to the iPad, then an iPad with a keyboard is essentially a Mac. Apple long resisted this convergence; now it appears to embrace it. And if Apple sees all these devices as companions and accessories for a pair of augmented reality glasses, it will need all of them to work well together.
The last time Apple had a wholly new idea about how we use gadgets was in 2007, when the iPhone launched. Since then, the industry has followed, improved, and adapted without ever deviating from the basics of multitouch. But augmented reality will break all of that; it can’t work any other way. That’s why companies are working on neural interfaces, trying to master gesture control, and trying to figure out how to display everything from translated text to maps and games on a tiny screen in front of your face.
Meta is already shipping and selling its best ideas; Google is showing its own in the form of Lens and camera features. Apple needs to start showing the world how it thinks about the future of augmented reality. Headset or no headset, that will be the story of WWDC 2022.