Luma AI

Luma AI is a developer of photorealistic 3D image capture software designed to enable users to capture 3D-viewable photos of scenes and objects on smartphones.

Website: lumalabs.ai

Is a: Company, Organization

Company attributes

Industry: Technology, AI design generation, Generative AI, Computer Vision, Artificial Intelligence (AI), 3D, Synthetic Media, Virtual reality, ...

Location: San Francisco

CEO: Amit Jain

Founder: Alberto Taiuti, Amit Jain, Alex Yu

Pitchbook URL: pitchbook.com/profiles...483378-04

Legal Name: Luma AI, Inc.

Number of Employees (Ranges): 11 – 500

Email Address: support@lumalabs.ai

Number of Employees: 13

Investors: Matrix Partners, NVIDIA, General Catalyst, Amplify Partners, South Park Commons, Andreas Klinger, Context Ventures, Andreessen Horowitz (a16z)

Founded Date: 2021

Total Funding Amount (USD): 67,300,000

Latest Funding Round Date: January 9, 2024

Competitors: Scenario, Mirage, 3Dfy, Kaedim

Business Model: Commerce

Latest Funding Type: Series B

Other attributes

Latest Funding Round Amount (USD): 43,000,000
Overview

Luma AI develops photorealistic 3D capture software that lets users capture photos of scenes and objects on smartphones and view them in 3D. The company combines advances in neural rendering and deep learning with the greater availability of compute to achieve photorealistic 3D capture. Luma AI targets a number of markets, including the e-commerce, real estate, and 3D games industries. The company also offers Genie, a multimodal text-to-3D model capable of creating any 3D object in under ten seconds with materials, quad mesh retopology, and variable polycount.

Headquartered in San Francisco, Luma AI was founded in 2021 by Amit Jain, Alberto Taiuti, and Alex Yu. Taiuti and Jain were previously Apple employees, and Yu was an AI researcher at UC Berkeley at the time of the company's founding. In October 2021, Luma raised $4.3 million in seed funding from Matrix Partners, South Park Commons, Amplify Partners, RFC's Andreas Klinger, Context Ventures, and a group of angel investors. In March 2023, the company announced a $20 million Series A round led by Amplify Partners with participation from NVIDIA (NVentures), General Catalyst, and existing investors from the seed round. In January 2024, Luma raised $43 million in a Series B round with participation from Andreessen Horowitz (a16z), Amplify Partners, Matrix Partners, NVIDIA, South Park Commons, and a group of angel investors. Sources stated that the funding valued Luma at between $200 million and $300 million.

Capture process and best practices
Capture best practices

Motion blur negatively impacts the quality of the digital render, so Luma advises users to move the phone slowly and avoid rapid rotational movements. For best results, the object or scene should be captured from as many unique angles as feasible, and the device should be moved around the subject rather than rotated from a stationary position. Standing in one place and capturing outwards in a spherical motion typically does not work well. The guided capture mode helps users achieve sufficient coverage of the object or scene to be rendered.

For guided captures, any object that can be easily viewed from all angles (including the top and bottom) is suitable. For free-form captures, any object is suitable, but more coverage yields better results, so larger objects may be problematic. To further improve the accuracy of the reconstruction, the entire object should remain in frame while it is being scanned, which gives the app more data about reflections and object shape.

Technical challenges

The app can have difficulty handling complex reflections (e.g., curved mirror-like surfaces), curved transparent objects (e.g., plastic water bottles), and very large textureless surfaces (e.g., white walls). The app can capture objects in most lighting conditions as long as textures are not too bright or too dark and remain identifiable. Lighting conditions are baked into the capture, so the scene should be lit the way the user wants it to appear in the final result.

In general, any movement in the scene during capture may degrade the quality of the render; for instance, tree leaves moving in the wind or people moving in the background can result in a loss of detail. The software is not compatible with video stabilization because it gives the frames unstable camera intrinsics, a problem that is particularly acute on Android devices. The HDR video option on iOS may also cause artifacts. Luma generally recommends using fixed exposure, although variable exposure can work well in outdoor scenes with varying lighting conditions.

Uploading footage

Instead of videos, users can upload raw images as zips of sequential photos through the Luma web interface. Photos are often higher quality than video frames and can be preferable when the highest render quality is desired.
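
As a minimal sketch of preparing such an upload, the snippet below bundles a folder of sequentially numbered photos into a single zip archive that can then be submitted through the web interface. The folder name, file pattern, and archive name are illustrative assumptions, not part of Luma's documented workflow; the source only states that a zip of sequential photos is accepted.

    import zipfile
    from pathlib import Path

    def zip_photos(photo_dir: str, archive_path: str) -> int:
        """Bundle sequential photos into a zip for manual upload via the Luma web interface.

        Returns the number of images added. Paths and naming are assumptions for illustration.
        """
        photos = sorted(Path(photo_dir).glob("*.jpg"))  # e.g. IMG_0001.jpg, IMG_0002.jpg, ...
        with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_STORED) as zf:
            for photo in photos:
                # Store without recompression: JPEGs are already compressed,
                # and keeping the original bytes preserves image quality.
                zf.write(photo, arcname=photo.name)
        return len(photos)

    if __name__ == "__main__":
        count = zip_photos("capture_photos", "capture_upload.zip")  # hypothetical paths
        print(f"Added {count} photos to capture_upload.zip")

Storing rather than recompressing the images keeps the archive step lossless, so the only quality trade-off remains the choice between photos and video frames described above.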
