Imagine being able to ask AI what’s happening right now in your physical environment. Jaime Lien, Founder & Head of Hardware and Signal Processing at Archetype AI, explains:
– Archetype AI is developing a generative AI foundation model that translates raw sensor data into a semantic, natural-language format, so anyone can understand and interact with complex sensor data.
– Their demo showcases how their model can interpret data from a radar sensor developed by Infineon, providing insights about the physical world around it. Users can ask high-level questions and receive responses about detected objects, their locations, and movement patterns.
– The model is multimodal, meaning it can handle data from various sensors such as radar, cameras, and microphones. It maps these different modalities into a common embedding space, enabling data fusion and producing insights that humans can easily understand.
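The common-embedding idea in the last bullet can be sketched very roughly as follows. This is an illustrative toy, not Archetype AI’s actual model: the encoders are stand-in random linear projections, and the dimensions and averaging-based fusion are assumptions made for the example.

```python
import numpy as np

# Shared embedding dimension that all modalities are projected into
# (the value 8 is arbitrary, chosen for illustration).
EMBED_DIM = 8

rng = np.random.default_rng(0)

# Stand-in "encoders": one random linear projection per modality.
# A real system would use trained neural encoders instead.
encoders = {
    "radar": rng.standard_normal((16, EMBED_DIM)),   # 16-dim radar features
    "camera": rng.standard_normal((32, EMBED_DIM)),  # 32-dim image features
    "audio": rng.standard_normal((12, EMBED_DIM)),   # 12-dim audio features
}

def embed(modality: str, features: np.ndarray) -> np.ndarray:
    """Map modality-specific features into the common space, L2-normalized."""
    vec = features @ encoders[modality]
    return vec / np.linalg.norm(vec)

def fuse(embeddings: list) -> np.ndarray:
    """Naive fusion: average the per-modality embeddings."""
    return np.mean(embeddings, axis=0)

# Different input sizes, same output space -- that is the point:
radar_e = embed("radar", rng.standard_normal(16))
camera_e = embed("camera", rng.standard_normal(32))
fused = fuse([radar_e, camera_e])
print(fused.shape)  # (8,)
```

Because every modality lands in the same space, downstream components (search, alerting, a language model) can operate on one representation regardless of which sensor produced the data.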
This approach to sensor data interpretation opens up new possibilities for monitoring and alerting systems, making them accessible to anyone without coding or programming knowledge.
Filmed at Infineon’s #Oktobertech event in Silicon Valley.