Last week Google I/O took place and there were a lot of exciting announcements, ideas and releases! I’d like to shine some light on two of the most visionary ones from Google’s ATAP (Advanced Technology and Projects) group: Project Jacquard and Project Soli.
Looking at how we input information and interact with devices and wearables, and how we can make use of movements and behaviors that come naturally to us, Project Soli is developing a new interaction sensor based on radar technology. The team evaluated options for tracking motion, handling overlapping objects, and working across various parts of the spectrum, and concluded that radar best met their requirements.
On the radar sensor:
The sensor can track sub-millimeter motions at high speed and accuracy. It fits onto a chip, can be produced at scale and built into small devices and everyday objects.
The chip works within the 60 GHz spectrum and can capture motions at up to 10,000 frames per second.
Watch the video to learn more!
Project Jacquard focuses on weaving touch and gesture interactivity into any textile, so that everyday objects – clothes and furniture – can be turned into interactive surfaces. This is achieved with conductive yarns, which combine metallic alloys with natural and synthetic fibers.
This means areas of the fabric can be made sensitive to touch and gestures, with complementary components like connectors and circuits embedded alongside them. Touch and gesture input can then be relayed to mobile phones or other devices, so actions are controlled directly from the wearable. At Google I/O, Levi’s announced a partnership with Google on this.
More in the video:
The ATAP presentation and projects
I strongly recommend watching the ATAP presentation on all the things they are working on, like: