Valkyrie Arline Savage, PhD - 3/3/2013 17:10:59
This week’s readings were all about displays: mobile ones, in general.
The IllumiShare paper described a mobile version of the DigitalDesk. There was nothing new technically in this paper (although the authors seemed to think there was); their focus was instead on how such a tool could foster play between children. I found their language difficult to understand in several places (regarding the multiplexing they do to remove visual echoes without flicker; what do “empowering” transitions mean?; what are the “digital objects” that participants were sharing?), but their general ideas were sound. One thing I loved about this work is that it actually went back and revisited old work. The authors cite a paper that seemed to contradict all the other papers in the children/play arena by stating that children didn’t like having a shared task space. They wanted to show that this earlier work was simply poorly done, and that children actually do like shared task spaces. I haven’t really seen any other work that reconsiders old results, so that was awesome.
The SideBySide paper describes a pretty neat modified-handheld-projector setup that allows multiple such projectors to talk to each other in any space using invisibly projected IR fiducial markers (QR-code-like patterns). They didn’t really do any user testing (despite many claims about how “seamless” the interaction is and how “easy to hold” the projectors are), or evaluation of any kind (how often are markers correctly identified? how much ambient light can the system tolerate?). That was sort of annoying. They did have a strong focus on how the base components of their system are feasible to manufacture right now, which is something that isn’t common in research papers. They offered solutions to basically all the pitfalls of their approach, except the fact that it isn’t very sunlight-compatible. Since this is Disney, I would expect that they have an interest in making this a real-world product, for which it would need to be actually robust to real-world conditions (another feature that they emphasized designing for but didn’t evaluate).
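The core geometry the system relies on, recovering the distance and angle between two projections from their tracked marker centers, is easy to sketch. Here is a minimal illustration in Python; this is my own simplification, not their implementation, and the actual IR marker detection and decoding is omitted:

```python
import math

def relative_pose(own_center, other_center):
    """Given the pixel centers of our own projected IR marker and the
    other projector's marker (both seen in our own IR camera frame),
    return the distance and bearing angle between the two projections.
    Coordinates are (x, y) in camera pixels."""
    dx = other_center[0] - own_center[0]
    dy = other_center[1] - own_center[1]
    distance = math.hypot(dx, dy)          # straight-line separation
    angle = math.degrees(math.atan2(dy, dx))  # bearing, 0 deg = to the right
    return distance, angle

# Example: the other projection is 300 px directly to the right of ours.
d, a = relative_pose((100, 200), (400, 200))
```

Game logic (e.g. "characters touch when distance < threshold") would then run on top of these values, which is presumably why marker-identification accuracy would have been worth evaluating.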
MouseLight was a very confusing paper. The writing was just awful, and I didn’t understand everything they were talking about (what is displaced copy-and-paste?). They did a reasonably good job of evaluation, in that they had a professional designer (an architect) evaluate their project on a real task. However, they also spent a lot of time talking about what college students thought of the project, which didn’t align at all with the designer’s comments (the students liked copy/paste, and he liked the projected French curves). The idea of having an overlay on a sheet that shows extra information is good, but it has been explored elsewhere. It sounds like their goal was to make a portable, easy-to-use projector that would encourage bimanual interaction, but they came up somewhat short here: many of their evaluators said the prototype was so bulky that it discouraged bimanual interaction.
arie meir - 3/4/2013 22:53:25
MouseLight provides a rich, dynamic visual feedback mechanism by trying to bridge the physical and digital worlds. By decoupling the input and output channels, MouseLight creates a new space of possible augmented interactions with physical objects like paper and other surfaces. Abstract operations such as copy and paste pose interesting new questions about the meaning of those high-level semantics in new interaction modalities: should the designer impose some predefined functionality on the user, or should the user choose how to define the meaning of new operations? The spatial awareness of the output channel allows for extended richness of the augmented features. I imagine the device is somewhat bulky and cumbersome, since it occupies both hands. Applications that don’t require keyboard input might be a good fit for it.
The IllumiShare work presents a task/reference-space sharing mechanism that allows sharing physical and digital objects on virtually arbitrary surfaces. The experimental study involving children is interesting, as it allows one to see more "natural" responses less rooted in long-acquired habits of doing things a certain way. I think that overall, any HCI study would benefit from an experimental phase involving a younger population. The ability of the device to share an arbitrary surface is interesting but has to be reality-tested: what would be a killer app for this kind of interaction?
The SideBySide system is cool due to its dynamic nature: in an ad-hoc fashion, users can interact with each other on a shared space, with projected images tracked via fiducial markers projected in the IR spectrum. In comparison to the previous system, this time the users share an experience through projected images on a shared surface. By measuring the angle and distance between the projections, dynamic interaction scenarios can be designed. The technical implementation of the device looks impressive, with a high level of integration and detail. The work lacks an experimental study, which leaves one wondering what real users would have to say.
Overall, it seems that the unifying motif among these three papers is using micro-projectors to create rich feedback devices that integrate the physical and digital domains. The three works focus on different applications and have different limitations, but the idea of blurring the gap between our physical reality and our in silico life keeps being pushed forward.
Sean Chen - 3/5/2013 0:37:54
IllumiShare is a device that helps with remote collaboration. By using cameras and projectors that display as well as record one’s workspace, users can arbitrarily share physical and digital content and communicate well in both the task and reference spaces.
Current remote-conferencing tools are efficient at sharing faces and digital content, but when we need to brainstorm or sketch out ideas, it’s often troublesome. This method utilizes the affordances of paper and can be really useful for conveying ideas. In addition, it can also project digital content. Previously, when multiple people wanted to annotate the same digital content, we had to print it out so that we didn’t have to take turns with the mouse; this system solves that problem as well. The demo video showed a few compelling use cases.
The hybrid of projecting and taking pictures reminds me of SecondLight, which also performs two tasks and switches between them fast enough that users do not notice.
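That switching idea can be sketched as a simple time-multiplexing schedule: projector and camera take alternating frame slots, so the camera never captures the projected overlay (which is what removes the visual echo), and the rate is high enough that neither flicker nor capture gaps are perceptible. A toy sketch in Python; the 120 Hz figure is my assumption for illustration, not a number from either paper:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    phase: str   # "project" (projector on) or "capture" (camera exposes)
    t_ms: float  # start time of this slot in milliseconds

def multiplex_schedule(n_frames, rate_hz=120.0):
    """Alternate projector-on and camera-capture slots. Because the
    camera exposes only during 'capture' slots, it never re-records
    the projected image, so no echo is sent to the remote side."""
    period_ms = 1000.0 / rate_hz
    return [Frame("project" if i % 2 == 0 else "capture", i * period_ms)
            for i in range(n_frames)]

sched = multiplex_schedule(4)
```

The real systems do this synchronization in hardware (shuttering the camera against the projector's refresh); the point of the sketch is just that each device effectively runs at half the slot rate.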
This paper did quite a thorough user study compared to the others we read. However, it didn’t explain why they chose children. I also wonder how it works if there are three people on the team: will the number of images that must be transferred and processed affect its performance?
SideBySide is a handheld projector that sends and receives visible and infrared light. The devices can interact with each other by detecting nearby infrared markers. Different markers represent different messages, and a variety of applications and games can be run on arbitrary surfaces.
This design does not require any environmental setup, except that outdoor sunlight might cause noise. It needs no display system, and multiple devices can interact with each other without any setup, since they don’t use wires or radio to communicate. In the previous papers, including Codex, most collaboration designs were for two people only and didn’t really take into consideration that there might be a third person on the team. What I like about this design is that the number of users is not limited to two.
The applications, however, seem fairly plain. Since most of the fingers are used to hold the projector, and because of the direction the projector has to be pointed, hand and finger movements are pretty limited. I wonder: if the projector were attached to a long glove with sensors on the fingers, perhaps users could have more types of interaction with the system and do more complicated tasks.
MouseLight is a mobile projector that can interact with users’ papers. The accompanying pen can simultaneously write on real paper and control the digital system. The projector provides an extra layer of information, and the user can control the system by using the pen to interact with projected buttons.
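Because the pen position and the projected overlay live in the same paper coordinate space (the pen is tracked on the paper, and the spatially aware projector knows where it is drawing), triggering a projected button reduces to plain rectangle hit-testing. A minimal sketch with hypothetical button names, not MouseLight’s actual widget code:

```python
def hit_test(pen_xy, buttons):
    """Return the id of the projected button under the pen tip, or None.
    pen_xy is the pen position in paper coordinates; each button is a
    tuple (id, x, y, w, h) in that same coordinate space, placed there
    by the projector's overlay."""
    px, py = pen_xy
    for bid, x, y, w, h in buttons:
        if x <= px <= x + w and y <= py <= y + h:
            return bid  # pen tip is inside this button's rectangle
    return None

# Two hypothetical overlay buttons, 30x12 units each.
buttons = [("copy", 10, 10, 30, 12), ("paste", 10, 25, 30, 12)]
```

Since the projector moves with the non-dominant hand, the overlay (and thus every button rectangle) is re-anchored each frame, but the hit test itself stays this simple as long as both inputs are expressed in paper coordinates.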
This seems like a smaller version of IllumiShare to me. They both try to project digital content onto real paper to get the best of both worlds. However, the area that MouseLight is able to project onto is fairly limited, perhaps as a tradeoff for portability. I wouldn’t want to have to move the projector around all the time just to see different areas of the paper.
I like some of the interaction design of the menus. They seem pretty intuitive and easy to use. However, it makes me wonder why they didn’t use other ways to perform these tasks. Since this design requires the digital data to be stored in the system anyway, it seems easier to use a Wacom tablet or the DigitalDesk to interact with the digital data directly. In the video demo, when the user tried to 'move a table', it was weird that the table was still on the real paper. In addition, there might still be some projection inaccuracy due to the projection angle, and if there is a lot of small information, it would look a bit messy.
Having said that, I do think the second scenario, the drafting and measuring tool, is more useful than trying to do AR. We know that paper prototyping is a great tool and that certain affordances of paper are hard to mimic with virtual systems. This tool can provide templates and components that accelerate paper prototyping while making it more precise.
David Burnett - 3/5/2013 1:13:06
"IllumiShare: Sharing Any Surface"
This project uses a projector and camera to share a physical task space across a telepresence/teleconference session. This space can be physically and digitally modified and all such changes will be transmitted to the other communicating party.
Other interactive surfaces exist and are available commercially, but those systems include two drawbacks. First, they are an order of magnitude more expensive than the system described by this paper. Second, they cannot be physically modified; all modifications must be performed in the digital domain.
IllumiShare offers an intuitive shared space in a way that pen-based tablet systems and others never have. It enables all the ways pen and paper in a face-to-face meeting are natural tools for transmitting ideas between people. This paper spent significant effort proving this point by performing rigorous user studies.
A potential downside is that the resolution and brightness of the system as implemented may make the space difficult to use for high-precision reading and working, and make it suitable for only certain lighting situations.
The project currently does not configure itself automatically, though this is fixable in future versions. Other such systems project fiducial markers to provide distance, angle, and orientation; IllumiShare needs such a feature to gain more widespread adoption and the refinement that comes with such proliferation.
"SideBySide: Ad-hoc Multi-user Interaction with Handheld Projectors"
This project allows two users of handheld projectors to project images that can detect where the other projector's image is and, depending on the current program, interact.
The concept of two projected images interacting is something children play with in situations as simple as flashlight beams, so the interaction technique is something highly intuitive. This concept was not tested, but it may be so obvious that no such user testing is necessary.
In an age where the 2.4GHz ISM band is fought for between Wifi, Bluetooth, ZigBee, cordless phones and mics, and more, it's refreshing to see a data transfer system that uses a different carrier. SideBySide transfers data through patterned IR reflection, which is immune to all those interferents. Even IR keyboards haven't been used in over a decade.
The trouble with IR is that it's a light source, so any other bright source can wash it out. Dark spaces are something that we perceive and can create easily, but the interactive nature of this system won't work in bright areas.
Lastly, it's undeniable that such an interactivity system is very different. This opens the door to rich new games and collaborative efforts, but potentially with a long lead time as developers understand how best to utilize such a unique system.
"MouseLight: Bimanual Interactions on Digital Paper using a Digital Pen and Spatially-aware Mobile Projector"
This project allows users to use a pen in one hand to interact with an image projected from their other hand, letting them interact with an augmented paper surface. It is, in effect, IllumiShare except that the other user is a computer instead of a person.
With the projector and relevant interactions in one hand and the pen in the other, time-costly and interrupting hand reposition (as from mouse to keyboard and back) is avoided. In addition, performing a fine operation with one hand on an object in the other hand is a long-established mode of human tool manipulation.
The pen is a tool most humans have fine control over, and interacting with a digital display with pen instead of mouse adds another degree of accuracy and precision. Furthermore, interacting with projection on real paper instead of a tablet surface leaps to the holy grail of pen-based input devices: the correct coefficient of friction for truly natural pen input.
Mouse and pen operations are typically both carried out by one's dominant hand, so mouse operations would need to be severely simplified to still be usable by most. As a result, this system won't augment traditional mouse and keyboard, and thus won't easily integrate into current computer work.
Projection of alternative layers or viewpoints of an image, printed or digital, is often the first proof of concept of a multi-layer display system. Given the unique interactive nature of this device and that such layered projection has been available for a very long time, projection that takes advantage of this specific combination should be created.
Joey Greenspun - 3/5/2013 8:38:11
elliot nahman - 3/5/2013 9:33:49
Passing on this one