# AR Room Scanner & Furnishing — **iOS-first (Unity 6.0 LTS)**
Clean, modular Unity **6.0 LTS** project focused on **iOS (ARKit)** for:
1) **Scanning** a real room on-device (LiDAR Scene Reconstruction when available)
2) **Measuring** in real-world meters
3) **Placing furniture** with collisions and snapping
4) Optional **Object Detection** (kept behind a compile flag)
5) Optional **RoomPlan** integration path (native bridge) for semantic room models
This repo is designed to be stable today and easy to evolve (upgrade to 6.1/6.2 later).
---
## Table of Contents
- [Why iOS](#why-ios)
- [Folder Layout](#folder-layout)
- [Assembly Definitions & Dependencies](#assembly-definitions--dependencies)
- [Packages & Versions](#packages--versions)
- [Device Support Matrix](#device-support-matrix)
- [Getting Started](#getting-started)
- [iOS Build & Xcode Setup](#ios-build--xcode-setup)
- [Scenes & Flow](#scenes--flow)
- [Core Modules](#core-modules)
- [API Integration](#api-integration)
- [Configuration & Environments](#configuration--environments)
- [Version Control (Git, LFS, SmartMerge)](#version-control-git-lfs-smartmerge)
- [Performance Guidelines (iOS/Metal)](#performance-guidelines-iosmetal)
- [RoomPlan (future path)](#roomplan-future-path)
- [Troubleshooting](#troubleshooting)
- [Roadmap](#roadmap)
- [License](#license)
---
## Why iOS
- **Best on-device scanning** via **ARKit Scene Reconstruction** on **LiDAR** devices (dense mesh, stable scale).
- **Consistent tracking & depth** across supported iPhones/iPads.
- Future option: **RoomPlan** (iOS 16+, LiDAR) to produce **semantic, parametric** rooms (walls/doors/windows) with accurate dimensions.
---
## Folder Layout
```
Assets/
  _Project/
    App/                # app flow & UI
      Controllers/
      Scenes/
        Bootstrap.unity
        ScanScene.unity
        FurnishScene.unity
      UI/
    ARRuntime/          # AR runtime features (platform-agnostic via AR Foundation)
      Scanning/         # mesh collection, colliders, export hooks
      Measurement/      # AB ruler, heights, helpers
      Placement/        # raycasts, snapping, overlap checks, physics
    Art/                # in-Unity assets
      Logos/
      Materials/
      Models/
      Prefabs/
      Shaders/
      Textures/
    Domain/             # pure business/domain (no Unity deps)
      Models/
      Services/
    Infra/              # outside world (API, storage, settings)
      Api/              # IFurnitureApi + HttpFurnitureApi + DTOs
      Persistence/      # OBJ/GLB export, JSON metadata
      Settings/         # ScriptableObjects (ApiConfig, ProjectFeatures)
    Detectors/
      Null/             # default no-op detector
      Lightship/        # stub; compiled only with LIGHTSHIP_ENABLED
    Tests/              # EditMode/PlayMode tests
  Settings/             # URP & project assets (keep!)
  XR/                   # added by packages
```
**Keep `Settings/`** (URP pipeline assets & editor links).
A separate top-level `Scans/` folder (outside `Assets/`) is recommended for large exports to avoid re-import churn.
---
## Assembly Definitions & Dependencies
Create one `.asmdef` per root module:
- `Domain` (no references)
- `Infra` → references `Domain`
- `ARRuntime` → references `Domain`
- `App` → references `ARRuntime`, `Infra`, `Domain`
- `Detectors.Lightship` → references `Infra`, `Domain` with **define constraint** `LIGHTSHIP_ENABLED`
- `Tests` → references as needed
**Dependency direction**
`App → (ARRuntime, Infra, Domain)`
`ARRuntime → Domain`
`Infra → Domain`
`Domain` depends on nothing
This keeps compile times low and prevents “upward” coupling.
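The layout above can be sketched as `.asmdef` JSON. These are illustrative fragments (field values and package references are assumptions; Unity also accepts GUID references):

```json
// Assets/_Project/ARRuntime/ARRuntime.asmdef
{
  "name": "ARRuntime",
  "references": ["Domain", "Unity.XR.ARFoundation", "Unity.XR.CoreUtils"],
  "autoReferenced": false
}
```

```json
// Assets/_Project/Detectors/Lightship/Detectors.Lightship.asmdef
{
  "name": "Detectors.Lightship",
  "references": ["Infra", "Domain"],
  "defineConstraints": ["LIGHTSHIP_ENABLED"],
  "autoReferenced": false
}
```

With the `defineConstraints` entry, the Lightship assembly compiles only when `LIGHTSHIP_ENABLED` is set in Player Settings → Scripting Define Symbols.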
---
## Packages & Versions
- **Unity**: 6.0 LTS (URP template)
- **AR Foundation**: 6.x
- **ARKit XR Plugin**: 6.x
- (Optional) XR Interaction Toolkit 3.x
- TextMesh Pro (built-in)
> Pin package versions in `Packages/manifest.json` once the project compiles cleanly.
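A pinned `Packages/manifest.json` fragment might look like the following (version numbers are examples only; use whatever resolves cleanly in your Package Manager):

```json
{
  "dependencies": {
    "com.unity.xr.arfoundation": "6.0.3",
    "com.unity.xr.arkit": "6.0.3",
    "com.unity.xr.interaction.toolkit": "3.0.5"
  }
}
```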
---
## Device Support Matrix
| Capability | Requirement | Notes |
|---|---|---|
| ARKit basic tracking | iPhone 8+ / iOS 13+ (practical: iOS 15+) | Non-LiDAR devices won't produce dense meshes. |
| **Scene Reconstruction (meshing)** | **LiDAR** devices (e.g., iPhone 12 Pro+, 13 Pro+, 14 Pro/Pro Max, 15 Pro/Pro Max; iPad Pro 2020+) | AR Foundation exposes meshes via `ARMeshManager`. |
| Environment Depth / People Occlusion | Device-dependent | Used for occlusion realism; works best on LiDAR. |
| **RoomPlan** (future) | iOS 16+, **LiDAR** | Generates semantic room model via native SDK. |
We gracefully **fall back**: if no LiDAR mesh is available, we still support plane detection + measurements.
---
## Getting Started
1) **Switch to iOS**
`File → Build Settings → iOS → Switch Platform`.
2) **Install packages** (Window → Package Manager)
- AR Foundation 6.x
- ARKit XR Plugin 6.x
- (Optional) XR Interaction Toolkit 3.x
3) **Project Settings (iOS)**
- **Player → iOS**
- Scripting Backend: **IL2CPP**
- Target Architectures: **ARM64**
- Graphics APIs: **Metal** (only)
- Minimum iOS Version: **15+** recommended (RoomPlan needs 16+)
- Camera Usage Description (Info.plist): e.g., _“AR scanning requires camera access.”_
- Photo Library Add Usage Description (if you export files to Photos)
- Motion Usage Description (only if you use CoreMotion; otherwise omit)
- **XR Plug-in Management → iOS**
- Enable **ARKit**
- In ARKit settings, enable **Environment Depth** (and People Occlusion if needed)
- **URP**
- SRP Batcher: **ON**
- MSAA: **2x**
- Mobile-friendly shadows
4) **Scenes**
- Create `Bootstrap.unity`, `ScanScene.unity`, `FurnishScene.unity` under `Assets/_Project/App/Scenes/`.
5) **First run**
- Add `Bootstrap` and `ScanScene` to Build Settings (Bootstrap first).
- Build to Xcode, set signing, run on device.
---
## iOS Build & Xcode Setup
1) **Build in Unity** → generates an Xcode project.
2) **Xcode**
- Select your **Team** & Provisioning profile
- Ensure **Camera** privacy string is present (Unity will add based on Player Settings)
   - In “Signing & Capabilities”, you typically **don't** need extra entitlements for ARKit beyond camera; add **Files (iCloud)** only if you export to Files app.
3) **Run on device** (USB).
4) If you see a black camera feed: check Privacy strings, ensure real device (not Simulator), and that `Requires ARKit` is set (Unity: Player → iOS → “Requires ARKit”).
---
## Scenes & Flow
### `Bootstrap.unity`
Minimal loader that switches to the first “real” scene:
```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

public class Bootstrap : MonoBehaviour
{
    [SerializeField] string firstScene = "ScanScene";

    void Start() => SceneManager.LoadScene(firstScene, LoadSceneMode.Single);
}
```
### `ScanScene.unity` (first milestone)
**Hierarchy (suggested)**
```
AR Session
XR Origin
└─ Camera Offset
└─ Main Camera (tag: MainCamera)
AR Managers (child of XR Origin)
├─ AR Plane Manager
├─ AR Raycast Manager
└─ AR Mesh Manager # must be under XR Origin in ARF 6
RoomMesh (MeshFilter + MeshRenderer + MeshCollider)
RoomScanner (script) # combines AR chunks into RoomMesh
MeasureLine (LineRenderer)
MeasureTool (script) # tap A→B distances in meters
```
On the **Main Camera** (child of XR Origin): add **AR Camera Manager** and **AR Occlusion Manager**.
### `FurnishScene.unity` (next milestone)
- Placement raycasts and snapping to **floor/walls**
- Overlap checks / colliders to prevent interpenetration
- Occlusion enabled for realism
---
## Core Modules
### `ARRuntime/Scanning` — RoomScanner
- Polls `ARMeshManager.meshes` and **combines** all chunks into a single mesh every N seconds.
- Assigns that mesh to `RoomMesh`'s `MeshFilter` and `MeshCollider`.
- The combined mesh is in **meters** (Unity units = meters), so measuring is straightforward.
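A minimal sketch of the merge loop (field names and the fixed interval are assumptions; `ARMeshManager.meshes` is the list of per-chunk `MeshFilter`s AR Foundation maintains):

```csharp
// Assets/_Project/ARRuntime/Scanning/RoomScanner.cs — illustrative sketch
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.XR.ARFoundation;

public class RoomScanner : MonoBehaviour
{
    [SerializeField] ARMeshManager meshManager;  // must live under XR Origin
    [SerializeField] MeshFilter roomMeshFilter;  // the RoomMesh object
    [SerializeField] MeshCollider roomCollider;
    [SerializeField] float combineInterval = 2f; // seconds between merges

    float _nextCombine;

    void Update()
    {
        if (Time.time < _nextCombine) return;
        _nextCombine = Time.time + combineInterval;
        CombineChunks();
    }

    void CombineChunks()
    {
        var chunks = meshManager.meshes; // ARKit mesh chunks as MeshFilters
        if (chunks == null || chunks.Count == 0) return;

        var combine = new CombineInstance[chunks.Count];
        for (int i = 0; i < chunks.Count; i++)
        {
            combine[i].mesh = chunks[i].sharedMesh;
            combine[i].transform = chunks[i].transform.localToWorldMatrix;
        }

        // 32-bit indices: room scans easily exceed 65k vertices.
        var merged = new Mesh { indexFormat = IndexFormat.UInt32 };
        merged.CombineMeshes(combine, mergeSubMeshes: true, useMatrices: true);

        roomMeshFilter.sharedMesh = merged;
        roomCollider.sharedMesh = merged; // only reassign when the mesh changed
    }
}
```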
### `ARRuntime/Measurement` — MeasureTool
- Touch once = point A, touch again = point B.
- Draws a line (`LineRenderer`) and shows **meters** to 2 decimals.
- Uses `Physics.Raycast` against the combined collider or against detected planes.
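The A→B flow can be sketched as follows (uses the legacy `Input` touch API for brevity; the 10 m ray length and the `Debug.Log` output are placeholder choices):

```csharp
// Assets/_Project/ARRuntime/Measurement/MeasureTool.cs — illustrative sketch
using UnityEngine;

public class MeasureTool : MonoBehaviour
{
    [SerializeField] LineRenderer line;
    Vector3? _pointA;

    void Update()
    {
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began) return;

        var ray = Camera.main.ScreenPointToRay(Input.GetTouch(0).position);
        if (!Physics.Raycast(ray, out var hit, 10f)) return; // hits RoomMesh collider

        if (_pointA == null)
        {
            _pointA = hit.point; // first tap: remember A
        }
        else
        {
            line.positionCount = 2;
            line.SetPosition(0, _pointA.Value);
            line.SetPosition(1, hit.point);

            // Unity units are meters in AR, so this is a real-world distance.
            float meters = Vector3.Distance(_pointA.Value, hit.point);
            Debug.Log($"Distance: {meters:F2} m");
            _pointA = null; // reset for the next measurement
        }
    }
}
```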
### `ARRuntime/Placement`
- Raycast from screen to floor/wall planes or to the combined mesh.
- Snap by projecting onto the plane's normal.
- Prevent collisions with `Physics.OverlapBox/Sphere` before placing.
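Two small helpers sketch the overlap check and wall snap described above (names and parameters are illustrative, not existing project code):

```csharp
// Assets/_Project/ARRuntime/Placement/PlacementChecks.cs — illustrative sketch
using UnityEngine;

public static class PlacementChecks
{
    // True if a box of `halfExtents` at `position`/`rotation` would intersect
    // existing colliders (room mesh or already-placed furniture) on `mask`.
    public static bool WouldOverlap(Vector3 position, Quaternion rotation,
                                    Vector3 halfExtents, LayerMask mask)
    {
        return Physics.CheckBox(position, halfExtents, rotation, mask);
    }

    // Snap a raycast hit flush against a wall by offsetting the furniture's
    // half-depth along the wall plane's normal.
    public static Vector3 SnapToWall(Vector3 hitPoint, Vector3 planeNormal, float halfDepth)
    {
        return hitPoint + planeNormal * halfDepth;
    }
}
```

Running `WouldOverlap` before instantiating the prefab lets the UI tint the placement ghost red/green instead of allowing interpenetration.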
---
## API Integration
Interface-first, so the app logic doesn't depend on a concrete client:
```csharp
// Infra/Api/IFurnitureApi.cs
using System.Collections.Generic;
using System.Threading.Tasks;

public interface IFurnitureApi
{
    Task<IReadOnlyList<Furniture>> GetVariantsAsync(IEnumerable<string> ids);
    Task<IReadOnlyList<Furniture>> SearchAsync(string query, int page = 1, int pageSize = 20);
}
```
```csharp
// Infra/Api/HttpFurnitureApi.cs
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Net.Http.Json; // System.Net.Http.Json extensions (may need a package in Unity)
using System.Threading.Tasks;

public sealed class HttpFurnitureApi : IFurnitureApi
{
    private readonly HttpClient _http;
    private readonly ApiConfig _cfg;

    public HttpFurnitureApi(HttpClient http, ApiConfig cfg) { _http = http; _cfg = cfg; }

    public async Task<IReadOnlyList<Furniture>> GetVariantsAsync(IEnumerable<string> ids)
    {
        var url = $"{_cfg.BaseUrl}/api/v1/FurnitureVariant/GetByIds";
        using var resp = await _http.PostAsJsonAsync(url, ids);
        resp.EnsureSuccessStatusCode();
        return await resp.Content.ReadFromJsonAsync<List<Furniture>>() ?? new();
    }

    public async Task<IReadOnlyList<Furniture>> SearchAsync(string query, int page = 1, int pageSize = 20)
    {
        var url = $"{_cfg.BaseUrl}/api/v1/FurnitureVariant/Search?query={Uri.EscapeDataString(query)}&page={page}&pageSize={pageSize}";
        return await _http.GetFromJsonAsync<List<Furniture>>(url) ?? new();
    }
}
```
**Config**: `Infra/Settings/ApiConfig` (ScriptableObject) with `BaseUrl`, timeouts, and environment selectors (DEV/QA/PROD).
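A possible shape for that ScriptableObject (field names and the example URLs are hypothetical):

```csharp
// Infra/Settings/ApiConfig.cs — illustrative sketch
using UnityEngine;

[CreateAssetMenu(menuName = "Settings/ApiConfig")]
public class ApiConfig : ScriptableObject
{
    public enum Env { Dev, Qa, Prod }

    public Env environment = Env.Dev;
    public string devBaseUrl  = "https://dev.example.com";
    public string qaBaseUrl   = "https://qa.example.com";
    public string prodBaseUrl = "https://api.example.com";
    public int timeoutSeconds = 30;

    // Resolve the base URL for the selected environment.
    public string BaseUrl => environment switch
    {
        Env.Qa   => qaBaseUrl,
        Env.Prod => prodBaseUrl,
        _        => devBaseUrl,
    };
}
```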
---
## Configuration & Environments
Create a `ProjectFeatures` ScriptableObject (in `Infra/Settings/`) with toggles:
- `useMeshing` (on by default)
- `useOcclusion` (environment/people)
- `useObjectDetection` (off; Lightship behind `LIGHTSHIP_ENABLED`)
- `enableExports` (to write OBJ/GLB to app storage)
This lets QA test different combinations without code changes.
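The toggle asset could be as simple as this sketch (defaults mirror the list above):

```csharp
// Infra/Settings/ProjectFeatures.cs — illustrative sketch
using UnityEngine;

[CreateAssetMenu(menuName = "Settings/ProjectFeatures")]
public class ProjectFeatures : ScriptableObject
{
    public bool useMeshing = true;          // LiDAR Scene Reconstruction
    public bool useOcclusion = true;        // environment/people occlusion
    public bool useObjectDetection = false; // Lightship code behind LIGHTSHIP_ENABLED
    public bool enableExports = false;      // OBJ/GLB writes to app storage
}
```

QA can then swap assets per build configuration instead of editing code.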
---
## Version Control (Git, LFS, SmartMerge)
- Keep **text/YAML** in Git (scenes, prefabs, materials, `.meta`, `ProjectSettings/`, `Packages/`).
- Track **large binaries** (GLB/OBJ/FBX/PSDs/EXR/8K textures) with **Git LFS**.
- Enable **Unity Smart Merge** for clean scene/prefab merges.
(We also recommend a top-level `Scans/` folder, LFS-tracked, for big room exports.)
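A starting-point `.gitattributes` for the rules above (extend the pattern list to match your actual asset types):

```
# Large binaries → Git LFS
*.fbx    filter=lfs diff=lfs merge=lfs -text
*.glb    filter=lfs diff=lfs merge=lfs -text
*.obj    filter=lfs diff=lfs merge=lfs -text
*.psd    filter=lfs diff=lfs merge=lfs -text
*.exr    filter=lfs diff=lfs merge=lfs -text
Scans/** filter=lfs diff=lfs merge=lfs -text

# Unity YAML → Smart Merge (UnityYAMLMerge must be registered in .gitconfig)
*.unity  merge=unityyamlmerge eol=lf
*.prefab merge=unityyamlmerge eol=lf
*.asset  merge=unityyamlmerge eol=lf
```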
---
## Performance Guidelines (iOS/Metal)
- **Metal only**, IL2CPP, ARM64.
- URP: **SRP Batcher ON**, **MSAA 2×**, mobile shadows.
- Avoid per-frame allocations; reuse buffers.
- Combine mesh at **intervals** (e.g., every 1–2 s) rather than every frame.
- Update `MeshCollider.sharedMesh` **only when merged mesh changes** to avoid spikes.
- Consider decimation for very large meshes if triangle count exceeds target thresholds.
---
## RoomPlan (future path)
If you need clean, semantic floorplans and furniture categories:
- Implement a **native iOS plugin** (Swift/Obj-C) that runs RoomPlan (iOS 16+, LiDAR).
- Export **USDZ/USDA/OBJ/GLTF** or RoomPlan **JSON**; import into Unity via `Infra/Persistence`.
- Provide an adapter `IRoomImporter` so `App` can switch between **Scene Reconstruction mesh** and **RoomPlan semantic model** at runtime/build time.
Keep all RoomPlan code behind a **`ROOMPLAN_ENABLED`** define if you prefer the same pattern as Lightship.
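The adapter seam might look like this (the interface and importer names are hypothetical; the RoomPlan parsing itself lives in the native plugin and is not shown):

```csharp
// Infra/Persistence/IRoomImporter.cs — illustrative sketch
using System.Threading.Tasks;
using UnityEngine;

// Lets App consume either the AR Foundation reconstruction mesh
// or a RoomPlan export without knowing which source produced it.
public interface IRoomImporter
{
    Task<Mesh> ImportAsync(string path);
}

#if ROOMPLAN_ENABLED
// Compiled only when the ROOMPLAN_ENABLED define is set, mirroring
// the LIGHTSHIP_ENABLED pattern used for Detectors.Lightship.
public sealed class RoomPlanImporter : IRoomImporter
{
    public Task<Mesh> ImportAsync(string path)
    {
        // Parse the USDZ/JSON written by the native RoomPlan bridge here.
        throw new System.NotImplementedException();
    }
}
#endif
```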
---
## Troubleshooting
- **Popup: “An ARMeshManager must be a child of an XROrigin.”**
Move `ARMeshManager` under **XR Origin** (not on the same GO).
- **Black camera feed**
Real device only (no Simulator), **Camera** usage string present, `Requires ARKit` ticked, provisioning OK.
- **No mesh appears**
Device may not have LiDAR; fall back to planes & depth. Ensure ARKit Scene Reconstruction is supported on the test device.
- **Raycast doesn't hit**
Ensure `RoomMesh` has a **MeshCollider** and that you merged at least once. Check layers.
- **Build fails in Xcode**
Clear Derived Data, check signing, ensure Metal is the only graphics API.
---
## Roadmap
- ✅ iOS-first scanning (Scene Reconstruction), measuring, placement skeleton
- ⏳ Exporters (OBJ/GLB + Draco), thumbnails, metadata (`units`, `bbox`, triangle count)
- ⏳ Furniture placement UX (snapping gizmos, grid, rotation/align to wall)
- ⏳ Semantic planes (wall/floor/ceiling) classification helpers
- ⏳ RoomPlan native bridge (optional feature flag)
- ⏳ Lightship detection re-enable (compile flag)
---
## License
TBD — choose a license that suits your distribution model (MIT/Apache-2.0/Proprietary).
---
### Quick Start (TL;DR)
1. Open in Unity **6.0 LTS**, switch to **iOS**.
2. Install **AR Foundation 6.x** + **ARKit XR Plugin 6.x**.
3. Player: **IL2CPP / ARM64 / Metal**, iOS 15+, Camera privacy string.
4. XR Plug-in Management: **ARKit ON**, enable Environment Depth.
5. Open **ScanScene** → Run on a **LiDAR** device → tap two points to measure in meters.
6. Move to **FurnishScene** for placement once scanning feels good.