Microsoft Live Labs has just announced and made available their Seadragon mobile application. Oddly, it works only on the iPhone for now. It appears to be a very natural connection to their Photosynth idea, which I reviewed previously: "... Seadragon Mobile brings the same smooth image browsing you get on the PC to the mobile platform. Get super-close in on a map or photo, with just a few pinches or taps of your finger. Browse an entire collection of photos from a single screen. You can browse Deep Zoom Images that you can create from your own pictures or..." The trouble is that the Photosynth link does not work at this time, but it is being worked on. See the Seadragon blog and site.
They take care to mention that this is just a showcase demo of an eventual system, not even an Alpha, whatever that means now. I installed it and played with their sample data: a database of global satellite images, map data from the Library of Congress, and other sample images. Transitions were smooth, but some of the zooming was slow, likely due to reloading over my WiFi connection.
I very much like the idea of an infinite field of images that I can manipulate and study, especially from a mobile device. It could be useful for certain kinds of collaborative design work, for example comparing real objects and locations against large numbers of stored images. It would be further useful to include tag generation and image analysis capabilities to aid exploration, along with a markup layer where I could collaborate with others on the field of images.
You can compare this to Google Earth, which also has a mobile viewer that gives fairly smooth drag-and-follow interaction on the iPhone. Compare it as well to Tag Galaxy for tagged images.
There is as yet no way to search any of the image metadata, such as tags, titles, or dates. That is quite fundamental, though I would imagine it would be easy to add at some point. So all I can do is browse in two-dimensional spaces like maps, and some of the exploration methods, other than zooming in and out, are still primitive. The Photosynth connection, when working, will also be interesting to examine for better exploration of recorded spaces.
Suppose we could generate images from our brain and then scan them this way? A backup of our own visual memory?
This is definitely worth a look and worth following to see how it evolves. There is also a Seadragon Ajax library that you can embed in your own web pages to provide a viewer and related capabilities. I have not explored it yet.
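For reference, the basic embedding pattern, as I understand it from the Seadragon Ajax samples, looks roughly like the sketch below. The script URL, the Seadragon.Viewer constructor, and the openDzi call are assumptions drawn from the Live Labs examples and may differ in the released version; the .dzi file is a Deep Zoom Image descriptor you generate from your own pictures.

```html
<!-- A minimal sketch of embedding a Seadragon Ajax viewer. The script URL,
     Seadragon.Viewer, and openDzi are assumptions based on the Live Labs
     samples; check them against the official Seadragon Ajax documentation. -->
<div id="container" style="width: 500px; height: 400px; background: black;"></div>

<script src="http://seadragon.com/ajax/0.8/seadragon-min.js"></script>
<script>
  // Bind a viewer to the container div, then open a Deep Zoom Image (.dzi)
  // descriptor, e.g. one exported from your own photos.
  var viewer = new Seadragon.Viewer("container");
  viewer.openDzi("images/sample.dzi"); // hypothetical path to your .dzi file
</script>
```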
Scanning just some of the comments, I see the usual cynical criticism and mostly childish poking at Microsoft. Ignore it.