Google just dropped its second Developer Preview for the Android XR SDK. It feels like only yesterday (okay, late last year) that they gave us a peek at the first one, and this release adds some genuinely useful stuff, like wider immersive video support.
At Google I/O, the company detailed the updated toolkit for app developers. The pitch is the same as before: build XR-native apps, or adapt existing Android apps for the platform. It's like giving them a whole new playground.
Here's the first big addition: support for MV-HEVC (Multiview HEVC) for 180° and 360° videos. MV-HEVC packs a separate video view for each eye into a single stream, which is what makes stereoscopic immersive video practical. If you're into 3D video, it's a big deal.
Jetpack Compose for XR is getting attention too. Picture adaptive UI layouts tuned for XR displays, with building blocks like SubspaceModifier and SpatialExternalSurface for sizing, positioning, and rendering content in 3D space. Sounds tidy, right?
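To get a feel for what that looks like in practice, here's a minimal sketch of a spatial panel using the Compose for XR preview APIs. Treat the names here (Subspace, SpatialPanel, and the SubspaceModifier extensions) as a sketch against the developer preview; they may shift before a stable release.

```kotlin
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.unit.dp
import androidx.xr.compose.spatial.Subspace
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.movable
import androidx.xr.compose.subspace.layout.resizable
import androidx.xr.compose.subspace.layout.width

@Composable
fun SpatialHello() {
    // Subspace opens a 3D volume inside the app; SpatialPanel places
    // a regular 2D Compose surface within that volume.
    Subspace {
        SpatialPanel(
            modifier = SubspaceModifier
                .width(512.dp)   // panel size within the subspace
                .height(342.dp)
                .movable()       // let the user drag the panel around
                .resizable()     // and resize it with system affordances
        ) {
            Text("Hello, Android XR")
        }
    }
}
```

The nice part of this model is that everything inside SpatialPanel is ordinary Compose, so an existing screen can be lifted into a spatial panel without rewriting its UI.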
Oh, and here's the kicker: ARCore for Jetpack XR now does hand tracking, exposing 26 posed joints per hand for gesture recognition. Imagine waving your hands and making things happen. Google has also published fresh guides to help you get started with it.
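Note that the SDK hands you joint poses, not finished gestures, so recognizing a pinch or a wave is on you. Here's a toy sketch of a pinch check in plain Kotlin; the JointPose class is a hypothetical stand-in for whatever pose type the ARCore for Jetpack XR API actually returns, and the 2 cm threshold is an arbitrary illustrative value.

```kotlin
import kotlin.math.sqrt

// Hypothetical stand-in for one of the 26 posed hand joints the SDK
// reports per hand; coordinates here are in meters.
data class JointPose(val x: Float, val y: Float, val z: Float)

// Straight-line distance between two joint positions.
fun distance(a: JointPose, b: JointPose): Float {
    val dx = a.x - b.x
    val dy = a.y - b.y
    val dz = a.z - b.z
    return sqrt(dx * dx + dy * dy + dz * dz)
}

// Treat "thumb tip close to index tip" as a pinch gesture.
fun isPinching(
    thumbTip: JointPose,
    indexTip: JointPose,
    thresholdMeters: Float = 0.02f
): Boolean = distance(thumbTip, indexTip) < thresholdMeters
```

In a real app you'd read the two fingertip joints from the tracking API each frame and feed them into a check like this, ideally with some smoothing so the gesture doesn't flicker at the threshold.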
Material Design is jumping in too, with adaptive components meant to make big-screen apps fit the XR world like a glove. Or maybe more like a cozy sweater? Point is, it adapts.
Most developers won't have their hands on Android XR headsets for a while yet, so Google's emulator is the practical option until hardware like Samsung's Project Moohan or XREAL's Project Aura ships. When will that be? Your guess is as good as mine.
And Google's been busy tinkering with the emulator itself: it now supports AMD GPUs, among other improvements, which should make building and testing apps a lot less painful.
Unity is still the reigning engine for XR development, and its updated packages add features like Dynamic Refresh Rate and SpaceWarp support through Shader Graph. Honestly, it sounds like sci-fi stuff, but it's real.
Plus, the Unity samples show off hand, plane, and face tracking, among other things. Maybe Android XR will headline at next year's Google I/O, but who knows?
On a tangent: Google isn't stopping at software. It's working on two styles of Android XR glasses. One is along the lines of Ray-Ban Meta glasses, and the other puts text, pictures, and maps right on the lenses. Didn't expect to be reading on glasses, but here we are.
If you want to dive deeper, Google's Android XR developer documentation has the full details. But hey, this should be enough for some XR chatter at your next digital coffee break.