Apple | The Knowledge Dynasty


Apple acquired augmented reality headset startup Vrvana for $30M

As Apple reportedly ramps up work to ship an augmented reality headset in 2020, it has acquired a startup from Montreal, Canada that could help it get there. TechCrunch has learned that Apple has acquired Vrvana, maker of the Totem headset — which had rave reviews but never shipped. The deal was for around $30 million, two sources tell TechCrunch.

We contacted Apple, and the company declined to comment, but also did not deny the story. Vrvana did not reply to our request for comment. Sources close to the deal have confirmed the acquisition to us.

The deal is significant because while we have seen reports and rumors about Apple’s interest in AR hardware, the company has been very tight-lipped and generally is very secretive about completely new, future products. This acquisition is perhaps the clearest indicator yet of what the company is hoping to develop.

A number of the startup’s employees have joined Apple in California. The Vrvana site is currently still up, but it stopped updating social accounts and news in August of this year.

It’s not clear how much of Vrvana’s existing products, product roadmap or current business — it worked with Valve, Tesla, Audi and others under NDA — will make its way to Apple.

The only product that Vrvana shows off on its site is the unreleased Totem headset, an “extended reality” device utilizing key technologies from both AR and virtual reality to allow for both experiences on a single headset.

A screen grab from one of Vrvana’s promotional videos for the Totem.

The tethered device had a form factor similar to many of today’s VR headsets, but uniquely relied on several forward-facing pass-through cameras to replicate the outside world on its OLED displays inside the headset. The system of cameras enabled 6DoF tracking, a technology which allows the device to track its position in 3D space, while also using infrared cameras to track a user’s hands.

Vrvana’s camera-based AR approach differs from that of competitors like Microsoft, which uses transparent, projection-based displays for its HoloLens headset. The Totem holds a number of advantages over these systems, most notably that it can overlay fully opaque, true-color animations on top of the real world rather than the ghost-like projections of other headsets, which critically cannot display the color black. This allows the headset to perform what the company calls “seamless blend” transitions between VR and AR environments.
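To make that contrast concrete, here is a minimal compositing sketch in Python. It is illustrative only, not Vrvana’s or Microsoft’s actual rendering pipeline, and shows why a camera pass-through design can draw fully opaque objects, including black ones, while an additive see-through display cannot:

import numpy as np

# Illustrative sketch only -- not Vrvana's or Microsoft's actual pipeline.
# Shapes: H x W x 3, float values in [0, 1].
camera_frame  = np.random.rand(4, 4, 3)   # what the pass-through cameras see
virtual_rgb   = np.zeros((4, 4, 3))       # a fully black virtual object
virtual_alpha = np.ones((4, 4, 1))        # drawn as fully opaque

# Video pass-through (Totem-style): virtual pixels replace camera pixels,
# so an opaque black object really appears black in the headset.
pass_through = virtual_alpha * virtual_rgb + (1 - virtual_alpha) * camera_frame

# Additive, see-through display (HoloLens-style): the display can only add
# light on top of the real scene, so a "black" hologram adds nothing and
# the real world shows through -- hence the ghost-like look.
additive = np.clip(camera_frame + virtual_rgb, 0.0, 1.0)

print(pass_through.max())                   # 0.0 -> black object fully occludes the scene
print(np.allclose(additive, camera_frame))  # True -> black adds no light at all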

A key disadvantage of these types of systems, aside from bulky aesthetics, is that there is often noticeable lag between the moment the cameras capture the outside world and the moment that view is displayed in-headset. Vrvana CEO Bertrand Nepveu detailed this problem in a talk this summer, where he shared that the startup had working prototypes that brought this latency down to 3 milliseconds.
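For context, a rough back-of-the-envelope calculation puts that 3 ms figure against a typical headset frame budget. The 90 Hz refresh rate below is an assumption for illustration, not a published Totem spec:

# Rough latency-budget arithmetic -- the 90 Hz refresh rate is an assumed,
# typical figure for tethered headsets, not a published Totem spec.
refresh_hz = 90
frame_time_ms = 1000 / refresh_hz    # ~11.1 ms per displayed frame
camera_latency_ms = 3                # figure cited by Vrvana's CEO

print(f"Frame budget: {frame_time_ms:.1f} ms")
print(f"Camera pass-through adds {camera_latency_ms} ms, "
      f"about {camera_latency_ms / frame_time_ms:.0%} of one frame")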

An animation showcasing how the Totem smoothly transitions between AR and VR modes.

There are consumer applications for this kind of “extended reality” technology — for example, in games and other entertainment — but one key focus for Vrvana was enterprise usage.

“Totem’s hand tracking and inside-out positional tracking empowers your workforce to manipulate virtual objects with their hands wherever they please,” the company said in promotional materials on the headset.

This is notable considering Apple’s focus — both on its own and in partnership with other IT providers like IBM, Cisco and SAP — on courting different enterprise verticals. In August, CEO Tim Cook singled out enterprise as one key focus for its AR ambitions, and in its most recent earnings call the company reported double-digit growth in the area. The company last broke out its enterprise sales back in 2015, when Cook described it as a $25 billion business.

But scaling remains one of the hardest things for startups — especially hardware startups — to do, and this is even more the case for startups working in emerging technologies that have yet to break into the mainstream.

Founded back in 2005, Vrvana had not disclosed much of its funding. A source tells TechCrunch the company raised less than $2 million, a modest figure in the world of hardware. Investors according to PitchBook included Real Ventures (whose partner Jean-Sebastian Cournoyer is also involved with Element.ai, an ambitious AI startup and incubator in Montreal), the Canadian Technology Accelerator, and angel Richard Adler, who is also active in other VR startups.

Up to now, Apple has been fairly critical of the state of VR and AR hardware in the market today, and it has downplayed its own hand in the game.

“Today I can tell you the technology itself doesn’t exist to do that in a quality way. The display technology required, as well as putting enough stuff around your face – there’s huge challenges with that,” Cook told The Independent in answer to a question about whether it was building a headset. “The field of view, the quality of the display itself, it’s not there yet…We don’t give a rat’s about being first, we want to be the best, and give people a great experience. But now anything you would see on the market any time soon would not be something any of us would be satisfied with. Nor do I think the vast majority of people would be satisfied.”

That’s not to say that Apple has not been enthusiastic about the augmented reality space. But to date, this interest has largely manifested itself through software — specifically the company’s iOS-based ARKit SDK — and the increasingly sophisticated camera arrays on the iPhone rather than through a dedicated device, although there have been plenty of Apple patents that also potentially point to one.

Apple also has made other acquisitions that underscore its interest in developing the technology that powers the hardware. In June, Apple acquired SMI, an eye-tracking firm that was working on solutions for VR and AR headsets. Other AR- and VR-related acquisitions have included Flyby Media, Metaio, Emotient, and Faceshift.

Read more: https://techcrunch.com/2017/11/21/apple-acquires-mixed-reality-headset-startup-vrvana-for-30m/

Google’s ‘Pixel Buds’ may be the key to breaking the language barrier

Google Pixel Buds shown at a Google event at the SFJAZZ Center in San Francisco, 04 Oct 2017. Image: AP/REX/Shutterstock

Out of all the products Google launched at its big event this week, there’s one that should have Apple really worried.

No, it’s not the Pixel phones (though they certainly seem like worthy iPhone competitors) or the MacBook-like Pixelbook, it’s the Pixel Buds.

More than any other gadget Google launched, the $159 Pixel Buds (which, by the way, are already out of stock on Google’s store) perfectly encapsulate how Google can use its incredible AI advantage to beat Apple at its own game.

To be clear, this isn’t about whether the Pixel Buds, as they are right now, are better than AirPods. I’m on record as a huge fan of my AirPods, and I walked away from my first Pixel Buds demo less impressed with the look and feel of Google’s ear buds.

But I’m talking about much more than just aesthetics, which are easily fixed (particularly now that Google has an extra 2,000 engineers from HTC onboard).

No, it was Google’s first public demo of the Pixel Buds — live, real-time translation on stage — that should have Apple very, very worried.

That demo is perhaps Google’s best example of how its new “AI-first” vision can completely and radically change its hardware — and its ability to compete with Apple. Pixel Buds, which have Google Assistant and real-time translation for 40 languages built right in, are, for now, Google’s best example of this vision.

But Pixel Buds are only the beginning.

These types of integrations will make their way to the rest of Google’s hardware faster than you can say “talking poop emoji.” There are already signs of it. The Pixel phones use algorithms — not extra lenses — to enable portrait mode and an overall smarter camera. The new Google Home Max uses AI to make its sound better. And Google’s first-class computer vision capabilities — whether in the Lens app, the Clips camera, or the Pixelbook’s image search — have the potential to completely change how you use cameras, and laptops, and smartphones.

So while Apple has the iPhone 8 and the massively hyped iPhone X for now — even I won’t pretend Google has a shot at outselling Apple in the near term — Google’s AI is so much further ahead of Apple’s that it’s almost laughable.

Yes, Cupertino has made a concerted effort to step up its AI recently, particularly when it comes to Siri. And the company’s latest iPhones are unquestionably its smartest yet. But Face ID and talking emoji pale in comparison to Google’s dominance.

And nowhere is that more evident than Pixel Buds.

Read more: http://mashable.com/2017/10/06/google-pixel-buds-apple-ai/

How secure is Apple’s Face ID, really?

In its latest product event, Apple confidently moved to convince consumers that face recognition is the most convenient way to secure your phone and the sensitive information you store in it. Face ID, the company’s face recognition technology, which will replace the fingerprint scanner in the new iPhone X, requires you only to show your face to your phone in order to unlock it, confirm Apple Pay payments, and make purchases in iTunes and the App Store.

According to Phil Schiller, senior vice president of marketing at Apple, “With the iPhone X, your iPhone is locked until you look at it and it recognizes you. Nothing has ever been more simple, natural, and effortless. This is the future of how we’ll unlock our smartphones and protect our sensitive information.”

To be sure, showing your face to your phone is easier than typing a passcode or pressing your finger against a scanner. It saves you a few seconds, you obviously can’t forget it, and it won’t be affected by moisture and oil.

But is it more secure?

Here are the key things you should consider about facial recognition before you enroll in the latest fad taking over the iPhone and other major smartphones.

Can Face ID be spoofed?

Face recognition authentication has existed for several years, but it has become notorious for its security flaws. Researchers and cybercriminals have been able to easily circumvent face locks on various devices by using hi-res pictures and videos of the owners. And as opposed to passwords, your face is not a secret. It’s available to anyone who Googles your name or gets close enough to snap a picture of you. Even Samsung’s Galaxy S8 face lock was shown to be fooled by a photo.

However, Face ID incorporates technology that makes it exponentially harder to bypass the lock. During setup, Face ID projects 30,000 infrared dots to create a 3D depth map of the owner’s face. It subsequently uses that map during authentication to make sure that it’s a real face in front of the camera and that its physical features correspond to those of the owner.
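Apple has not published the internals of Face ID’s matching (it reportedly runs neural networks on the device), but the basic idea of checking a live 3D capture against an enrolled depth map can be sketched conceptually. The grid size, error metric, and threshold below are purely illustrative assumptions:

import numpy as np

# Purely conceptual sketch -- Apple's actual Face ID matching (neural-network
# based, per Apple) is not public. Names and threshold are illustrative.
DOT_GRID = (200, 150)          # stand-in for the ~30,000 projected IR dots
MATCH_THRESHOLD = 0.05         # hypothetical mean-depth-error tolerance

def enroll(depth_scan: np.ndarray) -> np.ndarray:
    """Store a reference depth map captured during setup."""
    return depth_scan.astype(np.float32)

def authenticate(reference: np.ndarray, live_scan: np.ndarray) -> bool:
    """Accept only if the live 3D capture is close to the enrolled map."""
    error = np.mean(np.abs(reference - live_scan))
    return error < MATCH_THRESHOLD

reference  = enroll(np.random.rand(*DOT_GRID))
same_face  = reference + np.random.normal(0, 0.01, DOT_GRID)  # small natural variation
flat_photo = np.zeros(DOT_GRID)                               # no depth information at all

print(authenticate(reference, same_face))   # True -> the owner unlocks
print(authenticate(reference, flat_photo))  # False -> a 2D photo has no depth

The point of the sketch is simply that a flat photograph carries no depth information, so it fails the comparison that a real, enrolled face passes.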


Getting around depth maps will be much more difficult than fooling systems that rely on flat images. Apple says not even professionally made masks will work. Some experts believe it’s not impossible to fool, however, and that it’s only a matter of time and “enough external data” before the technology can be sidestepped. And per Apple, if you have an identical twin, Face ID may be fooled into mistaking them for you.

Further, depth sensors like the ones used in the iPhone X have their own technical challenges. They might fail under certain conditions, such as intense light or when you’re wearing a hat or scarf. Apple says the system works under various conditions, but we’ll have to verify that when the device actually ships.

Can Face ID be forcibly activated?

This is a question that applies to all biometric authentication mechanisms, including fingerprint scanners. If you’re captured by criminals or taken into the custody of law enforcement, can they unlock your phone by holding it up to your face?

Unfortunately, they can. The technology doesn’t work if you’re not looking at the phone or if you close your eyes, but it is not yet smart enough to tell the difference between a real unlock attempt and a forced one (maybe someday it will be). In the case of police, at least, they would legally be required to obtain a warrant before forcing you to unlock the device, according to legal experts.

Apparently, Apple recognizes this as a possible flaw in its technology. In iOS 11, users have to enter the iPhone’s passcode when connecting it to a new computer. This will make it harder to siphon data from a phone unlocked forcibly. Apple has also made it possible to disable Face ID and Touch ID, its fingerprint-scanning technology, by pressing the Home or Power button (depending on the device model) five times in rapid succession.

Where does Apple store your face data?

Your mug is not the most private part of your body. Governments have huge databases of citizens’ pictures, the internet may be flooded with pictures of you and your friends if you’ve been on social media in recent years, and facial recognition is already a serious privacy concern.

Nonetheless, you should be concerned about where your data is stored and how secure it is, especially the depth map of your face, which is still somewhat private. Most facial recognition software relies on machine learning algorithms, programs that work with huge data sets that are stored on cloud servers. Companies running these types of software need to collect more and more data samples to improve their performance. They might also mine the data for other commercial purposes or share it with third parties.

For the moment, Apple has made it clear that no face data will leave your phone, the same approach it has taken with Touch ID. Everything will be computed on the device thanks to its powerhouse A11 processor, and sensitive data will be stored in the Secure Enclave, the most secure component of the iPhone.

Apple’s Phil Schiller shows off Face ID on the iPhone X. Screenshot via Apple

How much data does Face ID collect?

This is perhaps the creepiest side of Face ID. The technology has no manual trigger on the iPhone X. You only need to hold the phone in front of your face to activate it, which means it’s always watching, waiting for your face to show up. How much data it stores is an open question.

But we’ve seen similar functionality cause privacy controversies with the Echo, Amazon’s smart speaker. And unlike the Echo, your iPhone doesn’t remain in your home. You take it with you wherever you go.

Moreover, there’s the question of what Apple will do with the technology once it has access to millions of people’s faces. The company didn’t have much incentive to collect fingerprint data. But face and gaze information is a totally different matter and can be used for things such as tracking attention and reaction to ads. We’ll have to see if Apple will resist the urge to make use of the technology in other potentially profitable endeavors.

For most users, Face ID will provide a secure and reliable way to protect the iPhone, with decent workarounds for most of its flaws. Apple says there is a 1-in-1,000,000 chance of Face ID being unlocked by someone other than you, as opposed to Touch ID, which stood at 1 in 50,000.
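A quick bit of arithmetic on those published figures shows how the two systems compare. The error rates are Apple’s own numbers; the snippet below just divides them:

# Comparing Apple's published false-accept odds for the two systems.
face_id_rate  = 1 / 1_000_000   # Face ID: 1 in 1,000,000
touch_id_rate = 1 / 50_000      # Touch ID: 1 in 50,000

print(f"Face ID false-accept probability:  {face_id_rate:.7f}")
print(f"Touch ID false-accept probability: {touch_id_rate:.6f}")
print(f"Face ID is {touch_id_rate / face_id_rate:.0f}x less likely "
      f"to be unlocked by a random stranger")   # -> 20x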

However, if you prefer privacy over convenience (as I do), remembering and typing a passcode is a small price to pay for higher security.

Ben Dickson is a software engineer and the founder of TechTalks. Follow his tweets at @bendee983 and his updates on Facebook.

Read more: https://www.dailydot.com/layer8/iphone-x-face-id/
