

Your iPhone Pro Has LiDAR: 7 Cool Things You Can Do With It


By Tim Brookes

Some high-end iPhone and iPad models have a LiDAR scanner integrated into the camera module on the back of the device. This effectively gives your device 3D scanning abilities with a few unique and interesting applications.

RELATED: What Is LiDAR, and How Will It Work on the iPhone?

What Does the LiDAR Scanner Do?

LiDAR stands for Light Detection And Ranging, though it’s also commonly referred to as “3D laser scanning” or some variation thereof. The technology works by bouncing pulses of light off surfaces and measuring how long they take to return, which reveals the shape and distance of objects in the area.

Think of it like RADAR (RAdio Detection And Ranging), but for light. Unlike RADAR imaging, LiDAR can produce more detailed, crisper scans with much smaller equipment. LiDAR uses light with wavelengths in the nanometer range, whereas RADAR relies on antennas that emit radio waves at far longer wavelengths and lower frequencies.
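If you’re curious about the underlying math, here’s a minimal Swift sketch of the time-of-flight idea (an illustration only, not Apple’s actual implementation): the sensor times how long an emitted pulse takes to bounce back and converts that round trip into a distance.

import Foundation

let speedOfLight = 299_792_458.0 // meters per second

// Distance to a surface, given the round-trip time of a reflected pulse in seconds.
// The pulse travels out and back, so the one-way distance is half the total path.
func distance(forRoundTripTime seconds: Double) -> Double {
    return speedOfLight * seconds / 2.0
}

// A pulse that returns after roughly 13.3 nanoseconds came from about 2 meters away.
print(distance(forRoundTripTime: 13.3e-9)) // ≈ 1.99 m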

Coupled with the software on your iPhone, the LiDAR scanner can be used to create 3D representations of objects and their surroundings. To do this, you’ll need the appropriate software, which you can download from the App Store. Some core iPhone features, like the Camera app, use LiDAR in other ways.

At present, only the iPhone 12 Pro and Pro Max, iPhone 13 Pro and Pro Max, iPad Pro 11-inch (2nd and 3rd generation), and iPad Pro 12.9-inch (4th and 5th generation) are equipped with LiDAR scanners. If you look at the camera array on your device, the LiDAR scanner appears as a small black circle.

RELATED: How to Use the iPhone Camera App: The Ultimate Guide

Create 3D Scans of Places You Love

Imagine if you had an interactive 3D model of your childhood home or a treehouse you built when you were young. Most of us keep photographs to remind us of places we once lived and loved, but what if, instead of flat images, we could capture 3D scans?

Well, if you have an iPhone or an iPad with a LiDAR scanner on the back, you can do exactly that. Having a 3D space to navigate is a lot more immersive than simply looking at a 2D image. Photographs and videos still have their place, but why not augment your memory bank with something you can experience in three dimensions instead?

This is possible with apps like Polycam, RoomScan LiDAR, and Canvas: Pocket 3D Room Scanner. Most of these apps are free to use, though there are paid upgrades that remove certain restrictions and improve the quality of the scans that you make. You can see LiDAR scanning in action in a YouTube video published by Polycam.

RELATED: How to See 3D Walking Directions in Google Maps

Buying a House? Redecorating? Scan First

Capturing a 3D model of a room or building has some very practical uses. If you’re currently looking to rent or buy a house, taking a scan of the building can help you make up your mind whether or not the place is for you. The process is very similar to taking a walk-through video or series of photographs, both of which are common practices in the real estate world.

Not only is a 3D scan more immersive, it also makes it easier to compare sizes, layouts, practical space, and the potential for renovations and other major work. We’d still recommend taking detailed photos and videos in addition to your scan, which works best with apps like Polycam and RoomScan LiDAR.

If you’re planning major work in a home you already own, a 3D scan can give you a base from which to work in a 3D modeling app like Blender (if you’re comfortable working in such an app). Alternatively, it can provide a nice “before and after” comparison to look back on.

Lastly, selling your house without a real estate agent is surging in popularity. These apps let you provide 3D scans of the property to potential buyers while cutting out expensive agent fees.

RELATED: The Best DIY Home Improvement Apps for iPhone and Android

Create Your Own 3D Assets

Photogrammetry is the process of creating 3D objects from photographic data, and it’s a time-consuming one. While the assets that photogrammetry produces are often highly accurate and detailed, taking an item from a series of photographs to a finished, usable model can take hundreds of hours.

By comparison, a scan made on an iPhone or iPad with an app like Polycam takes a matter of minutes. Scanning an object is a bit like taking a video, and when you’re finished you can export a file that can be used in 3D modeling apps like Blender. Once you’ve tidied up your scan, you can import the object into 3D engines like Unity and Unreal.

These engines are used heavily in game development, film, and interactive media. Game developer Conor O’Kane has not only used this technique but also published a YouTube tutorial showing how it’s done and why small developers might be interested in the process.

Scan and Share Interesting or Cherished Items

Are you a collector? Whatever it is you collect—art, plants, games consoles, or even cars—you might get a kick out of exhibiting it online, in a 3D format. Polycam is perfect for this since it includes built-in model sharing with the rest of the Polycam community, or “polyworld” as the app refers to it.

Some people share scans of ice cream or rock crystals; others share their extensive sneaker collections. Even if you don’t have a LiDAR scanner on your iPhone or iPad, Polycam may still be worth the download just to see what people are scanning and sharing.

It’s like Instagram, but for 3D models, with an interactive element that other forms of media can’t match. It’s easy, too: even if you make a bit of a mess while scanning, the app has intuitive crop controls that let you remove background objects or the surface your item was sitting on.

Tip: For best results, place your item on a stand, pedestal, or stool before scanning.

Take Better Photos in the Dark

Your iPhone or iPad already does this automatically, so you don’t need to activate anything to get the benefit. However, if you’ve been putting off taking photos in the dark because you don’t trust your device’s autofocus, you might want to reconsider if you have a LiDAR-equipped model.

Since LiDAR judges distance by how long emitted light takes to return to the sensor, it can calculate autofocus in the dark far more reliably.

While standard cameras and non-LiDAR-equipped models use contrast- and phase-detection autofocus (what Apple calls “Focus Pixels”), which struggles in low light, a LiDAR-equipped model fares much better. Coupled with Night mode, you should be better equipped to take photos in the dark.

Measure More Accurately

You may not have realized it, but Apple includes an app called Measure with iOS by default. If you’ve previously dismissed and deleted it, you can download Measure again for free from the App Store.

The app uses augmented reality to measure real-world distances simply by pointing your phone at a surface. Tap the plus “+” icon to start the measurement and move your device to see it in action.

iPhone Measure app

With a LiDAR scanner, augmented reality is vastly improved on the iPhone and iPad. Measure has gone from being a fun party trick to being surprisingly accurate. In our tests, the app was right most of the time, with a margin of error of around 1 cm. This may depend more on how steady your hands are than anything else.

RELATED: How to Measure Distances With Your iPhone

Get More From AR Apps

Measure is just one AR app that performs better when paired with a LiDAR-equipped iPhone or iPad. Every other AR app can benefit from the sensor too, since it provides a more stable experience by gauging distance with pulses of light rather than estimates derived from a “flat” image.
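For developers who are curious how apps tap into this, here’s a minimal Swift sketch using Apple’s ARKit framework; the configuration shown is illustrative, and the commented-out session call assumes an ARSession you’ve created elsewhere.

import ARKit

let configuration = ARWorldTrackingConfiguration()

// Scene reconstruction (a live 3D mesh of the room) is only offered on LiDAR-equipped devices.
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    configuration.sceneReconstruction = .mesh
}

// Per-pixel depth from the LiDAR scanner, where supported.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
    configuration.frameSemantics.insert(.sceneDepth)
}

// arSession.run(configuration) // arSession would be the ARSession driving your AR view.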

LiDAR really helps improve the AR experience across the board, such as when you’re creating art in apps like World Brush, SketchAR, and Assemblr. Need to provide remote assistance for a real-world problem? Vuforia Chalk allows you to scribble on real-world objects to help relay the point.

Try out various bits of IKEA furniture in your house with IKEA Place, or bring your child’s LEGO sets to life with LEGO Hidden Side, as demonstrated in a YouTube video published by The Brothers Brick. If you’d rather have a more educational AR experience, Playground AR provides a whole physics sandbox to play around with.

Of course, these experiences are available on just about any recent iPhone, but the experience is considerably more stable with a LiDAR scanner. Find even more apps to enjoy in Review Geek’s roundup of the best AR apps for iPhone and Android.

RELATED: The Best Augmented Reality Apps for iPhone and Android


DOJ hires Microsoft's former chief economist Susan Athey as its top antitrust economist; sources say Athey will likely be recused from Apple and Google cases (Leah Nylen/Bloomberg)

Leah Nylen / Bloomberg:

Susan Athey, a Stanford University professor and former chief economist at Microsoft Corp., is joining the Justice Department as its top antitrust economist.


These simple changes can make AI research much more energy efficient


Since the first paper studying AI’s impact on the environment was published three years ago, a movement has grown among researchers to self-report the energy consumed and emissions generated by their work. Having accurate numbers is an important step toward making changes, but actually gathering those numbers can be a challenge.

“You can’t improve what you can’t measure,” says Jesse Dodge, a research scientist at the Allen Institute for AI in Seattle. “The first step for us, if we want to make progress on reducing emissions, is we have to get a good measurement.”

To that end, the Allen Institute recently collaborated with Microsoft, the AI company Hugging Face, and three universities to create a tool that measures the electricity usage of any machine-learning program that runs on Azure, Microsoft’s cloud service. With it, Azure users building new models can view the total electricity consumed by graphics processing units (GPUs)—computer chips specialized for running calculations in parallel—during every phase of their project, from selecting a model to training it and putting it to use. That makes Microsoft the first major cloud provider to give users access to information about the energy impact of their machine-learning programs.

While tools already exist that measure energy use and emissions from machine-learning algorithms running on local servers, those tools don’t work when researchers use cloud services provided by companies like Microsoft, Amazon, and Google. Those services don’t give users direct visibility into the GPU, CPU, and memory resources their activities consume—and the existing tools, like Carbontracker, Experiment Tracker, EnergyVis, and CodeCarbon, need those values in order to provide accurate estimates.

The new Azure tool, which debuted in October, currently reports energy use, not emissions. So Dodge and other researchers figured out how to map energy use to emissions, and they presented a companion paper on that work at FAccT, a major computer science conference, in late June. Researchers used a service called WattTime to estimate emissions based on the zip codes of cloud servers running 11 machine-learning models.

They found that emissions can be significantly reduced if researchers use servers in specific geographic locations and at certain times of day. Emissions from training small machine-learning models can be cut by up to 80% if the training starts when more renewable electricity is available on the grid, while emissions from large models can be cut by over 20% if the training work is paused when renewable electricity is scarce and restarted when it’s more plentiful.
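To see why timing matters, here’s a toy back-of-the-envelope calculation in Swift (a sketch with made-up numbers, not the Azure tool or the paper’s methodology): a job’s emissions are roughly the energy it consumes multiplied by the grid’s carbon intensity at the time it runs, so the same training run produces less CO2 when cleaner power is on the grid.

// Illustrative numbers only; real carbon intensities vary by grid, region, and hour.
let trainingEnergyKWh = 120.0          // electricity consumed by a hypothetical training run

let eveningIntensity = 0.45            // kg CO2 per kWh when fossil generation dominates
let middayIntensity  = 0.18            // kg CO2 per kWh when solar output is high

let eveningEmissions = trainingEnergyKWh * eveningIntensity  // 54.0 kg CO2
let middayEmissions  = trainingEnergyKWh * middayIntensity   // 21.6 kg CO2

let reduction = (eveningEmissions - middayEmissions) / eveningEmissions
print("Shifting the run to midday cuts emissions by \(Int((reduction * 100).rounded()))%") // 60%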


Amazon’s Echo Show 8 Smart Display Hits New All-Time Low Ahead of Prime Day


This story is part of Amazon Prime Day, CNET’s guide for everything you need to know and how to find the best deals.

Amazon’s lineup of Echo Show smart displays adds a touchscreen to the voice-controlled Alexa experience, and that visual approach to smart home assistance makes for a much better experience when you’re asking for a weather forecast or a recipe to follow along with, or when you want to place a quick video call to a loved one or a co-worker. If you’ve been looking to add one to your smart home setup, your timing couldn’t be better — Amazon’s Prime Day sale is just days away, and early smart home specials include deals on Echo devices, several of which are available right now at their lowest prices ever.

Among them is the first-gen Echo Show 8 from 2019. As the name suggests, the marquee feature is an 8-inch HD touchscreen, but we also appreciated the decent sound and camera quality, along with the inclusion of a physical privacy shutter that covers the camera lens when you aren’t using it. Originally priced at $130, the first-gen Echo Show 8 is available now as an early Prime Day deal for just $55, which is half of what Amazon usually asks for it these days and $5 less than its previous all-time low. That’s an outstanding value for an Editors’ Choice-winning smart display.

So what are you missing by going with the first-gen Echo Show 8 instead of the second-gen follow-up? Not much, honestly. Along with a faster processor, the primary upgrade in last year’s second-gen Echo Show 8 is a much higher-megapixel camera that Amazon uses to artificially pan and zoom to keep you in the frame if you’re moving around during a video call. It’s neat, but it’s not a must-have. The same goes for the second-gen display’s built-in Zigbee radio for connecting lights and locks to your network and its person-detection feature, which uses motion-sensing smarts to trigger Alexa routines when someone walks into the room. They’re nice to have, but you aren’t missing out on too much by skipping them, especially at a price like this.

Your other Alexa-enabled alternatives would be the pint-sized Echo Show 5, which is probably a better fit for a bedroom nightstand thanks to its smaller stature and snooze-able sunrise alarms, or the motorized Echo Show 10, which can automatically pivot the entire touchscreen to follow you around the room during calls, or the wall-mountable Echo Show 15, which is designed more like a king-sized smart picture frame. Even at full price, we like the Echo Show 8 as a good sweet spot selection from that lineup. At $55, it’s a steal.
