## **Overview**

**OmniNeural** is the first fully multimodal model designed specifically for Neural Processing Units (NPUs). It natively understands **text, images, and audio**, and runs across PCs, mobile devices, automobiles, IoT devices, and robots.

### **Demo running on the Snapdragon NPU in the Samsung S25 Ultra**

A fully local, multimodal conversational AI assistant that hears you and sees what you see is finally possible. Because inference runs on the NPU, battery life stays long.
---
## **Key Features**
---
## **Performance / Benchmarks**
### Human Evaluation (vs baselines)

- **Vision**: Wins or ties on ~75% of prompts against Apple Foundation, Gemma-3n-E4B, and Qwen2.5-Omni-3B.
---
## **Production Use Cases**
- **PC & Mobile** – On-device AI agents combine **voice, vision, and text** for natural, accurate responses.
  - Examples: Summarize slides into an email (PC), extract action items from chat (mobile).
  - Benefits: Private, offline, battery-efficient.
- **Automotive** – In-car assistants handle **voice control, cabin safety, and environment awareness**.
  - Examples: Detect risks (child unbuckled, pet left behind, loose objects) and road conditions (fog, construction).
  - Benefits: Decisions run locally in milliseconds.
- **IoT & Robotics** – Multimodal sensing for **factories, AR/VR, drones, and robots**.
  - Examples: Defect detection, technician overlays, hazard spotting mid-flight, natural robot interaction.
  - Benefits: Works without network connectivity.
---
## **How to use** *(TODO)*
> ⚠️ Note: OmniNeural currently runs on Qualcomm NPUs (Snapdragon devices).